PingVortex committed on
Commit 1a76289 · verified · 1 Parent(s): 2c5af53

Update README.md

Files changed (1):
  1. README.md +5 -23
README.md CHANGED
@@ -20,21 +20,17 @@ pipeline_tag: text-generation
 library_name: transformers
 ---
 
-# Youtube shorts comment generator 🧠 (I couldn't come up with a more original name)
-
-A **fine-tuned DistilGPT2 model** trained on 1.4M+ YouTube Shorts comments - the perfect language model for generating cursed internet humor, emoji spam, and authentic YouTube degeneracy.
+# Youtube shorts comment generator
 
 - Base model: [distilgpt2](https://huggingface.co/distilgpt2)
 - Trained on: [YouTube Shorts Comments Dataset](https://huggingface.co/datasets/PingVortex/Youtube_shorts_comments)
-- Creator: [PingVortex](https://github.com/PingVortex)
 
-## Model Details 🔥
+## Model Details
 
 - **Parameters**: 82M (DistilGPT2 architecture)
 - **Training Data**: 1,475,500 YouTube Shorts comments
-- **Special Skills**: Emoji generation, broken English, random character generation
 
-## Usage Example 🐍
+## Usage Example
 
 ```python
 from transformers import pipeline
@@ -48,24 +44,10 @@ print(output[0]['generated_text'])
 *Sample output:*
 `"When you see a Sigma edit: 😂😂😂😂 The white one on the last pic?😂😂😂😅😅😅😊😊😊😅😮😮😅"`
 
-## Training Info ⚙️
+## Training Info
 
 - **Epochs**: 1
 - **Batch Size**: 8
 - **Hardware**: Google Colab T4 GPU
 - **Training Time**: ~2 hours
-- **Loss**: 0.24
-
-## Ethical Considerations ⚠️
-
-This model may generate:
-- Extreme emoji spam (🔥💀🤣)
-- Nonsensical combinations
-- Mild brain damage
-- Occasional coherent text
-
-Use responsibly (or irresponsibly, we don't judge).
-
-## License 📜
-
-**CC0 1.0 Universal** (Public Domain)
+- **Loss**: 0.24
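For context, the hunk truncates the README's usage snippet: only the `from transformers import pipeline` line and the closing `print(output[0]['generated_text'])` are visible in this diff. A minimal sketch of what the full snippet plausibly looks like follows; the checkpoint id (the base `distilgpt2` is used as a stand-in, since the fine-tuned model's repo id is not shown here), the prompt, and the generation parameters are all assumptions, not the README's actual values.

```python
from transformers import pipeline

# "distilgpt2" is a stand-in checkpoint -- the fine-tuned model's repo id
# is not visible in this diff.
generator = pipeline("text-generation", model="distilgpt2")

# Prompt and generation parameters are illustrative assumptions; sampling
# suits the chaotic comment style the README describes.
output = generator(
    "When you see a Sigma edit:",
    max_new_tokens=40,
    do_sample=True,
)
print(output[0]["generated_text"])
```

By default the `text-generation` pipeline returns a list of dicts whose `generated_text` field includes the prompt followed by the continuation.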