docs: change example in the README
README.md CHANGED
````diff
@@ -6,11 +6,12 @@ More info about the project can be found in the [official GitHub repository](htt
 
 ### Legal notes
 
-Due to uncertainty about legal rights, the dataset used for training the model is not provided. I hope you'll understand. The lyrics in question have been scraped from
+Due to uncertainty about legal rights, the dataset used for training the model is not provided. I hope you'll understand. The lyrics in question have been scraped from the website [DarkLyrics](http://www.darklyrics.com/) using the library [metal-parser](https://github.com/lucone83/metal-parser).
 
 ## Intended uses and limitations
 
 The model is released under the [Apache 2.0 license](https://www.apache.org/licenses/LICENSE-2.0). You can use the raw model for lyrics generation or fine-tune it further to a downstream task.
+The model is capable of generating **explicit lyrics**. This is a consequence of fine-tuning on a dataset that contained such lyrics, which are part of the discography of many heavy metal bands. Be aware of this before you use the model; the author is **not liable for any emotional response and its consequences**.
 
 ## How to use
 
@@ -21,7 +22,7 @@ You can use this model directly with a pipeline for text generation. Since the g
 >>> generator = pipeline('text-generation', model='lucone83/deep-metal', device=-1) # to use GPU, set device=<CUDA_device_ordinal>
 >>> set_seed(42)
 >>> generator(
-    "
+    "End of passion play",
     num_return_sequences=1,
     max_length=256,
     min_length=128,
@@ -29,7 +30,7 @@ You can use this model directly with a pipeline for text generation. Since the g
     top_k=0,
     temperature=0.90
 )
-[{'generated_text': "
+[{'generated_text': "End of passion play for you\nFrom the spiritual to the mental\nIt is hard to see it all\nBut let's not end up all be\nTill we see the fruits of our deeds\nAnd see the fruits of our suffering\nLet's start a fight for victory\nIt was our birthright\nIt was our call\nIt was our call\nIt was our call\nIt was our call\nIt was our call\nIt was our call\nIt was our call\nIt was our call\nIt was our call\nIt was our call\nIt was our call\nIt was our call \nLike a typhoon upon your doorstep\nYou're a victim of own misery\nAnd the god has made you pay\n\nWe are the wolves\nWe will not stand the pain\nWe are the wolves\nWe will never give up\nWe are the wolves\nWe will not leave\nWe are the wolves\nWe will never grow old\nWe are the wolves\nWe will never grow old "}]
 ```
 
 Of course, it's possible to play with parameters like `top_k`, `top_p`, `temperature`, `max_length` and all the other parameters included in the `generate` method. Please look at the [documentation](https://huggingface.co/transformers/main_classes/model.html?highlight=generate#transformers.generation_utils.GenerationMixin.generate) for further insights.
````
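For reference, a self-contained version of the example introduced by this change might look roughly like the following. This is a sketch, not taken verbatim from the README: the imports and the `do_sample=True` flag are assumptions (the diff hides the surrounding lines), while the model name, prompt, and sampling parameters come from the hunks above.

```python
from transformers import pipeline, set_seed

# Build the text-generation pipeline on CPU (device=-1);
# to use a GPU, set device to the CUDA device ordinal, e.g. device=0.
generator = pipeline('text-generation', model='lucone83/deep-metal', device=-1)

# Fix the seed so the sampled lyrics are reproducible.
set_seed(42)

output = generator(
    "End of passion play",    # prompt used in the README example
    do_sample=True,           # assumption: sampling enabled, as implied by top_k/temperature
    num_return_sequences=1,
    max_length=256,
    min_length=128,
    top_k=0,
    temperature=0.90,
)

# The pipeline returns a list of dicts, one per returned sequence.
print(output[0]['generated_text'])
```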
|
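The README's closing paragraph points at the other `generate` parameters; as a small illustration (again only a sketch, reusing the `generator` object from the snippet above, with arbitrarily chosen values), nucleus sampling with `top_p` can be used instead of `top_k`:

```python
# Minimal sketch of nucleus sampling: top_p replaces top_k as the truncation strategy.
output = generator(
    "End of passion play",
    do_sample=True,
    top_p=0.92,          # keep the smallest token set whose cumulative probability reaches 0.92
    temperature=0.85,
    max_length=128,
    num_return_sequences=1,
)
print(output[0]['generated_text'])
```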