Commit 2968275 · Update README.md
Parent(s): d6035a0

README.md CHANGED
@@ -24,8 +24,8 @@ inference:
 ---
 
 # Code Generation using GPT2-Large
-This is a GPT2-large model that's further fine-tuned on the Codeparrot
-
+This is a GPT2-large model that's further fine-tuned on the Codeparrot dataset with a custom metric focused on code generation. <br>
+The Tokenizer is initialized from the GPT2-large and further trained on the same dataset to better align the tokenization for generating code.
 
 ## Model description
 This Model has the same architecture and Parameters as the GPT2-large model. Please refer to this [link](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf) to know more about the model details.
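The added lines say the tokenizer was initialized from GPT2-large and then retrained on the same Codeparrot data so that token boundaries fit source code. A minimal sketch of that step with the `transformers` fast-tokenizer API is below; the dataset id `codeparrot/codeparrot-clean`, its `content` field, and the batch size are assumptions, since the commit does not name them.

```python
# Sketch: retrain the GPT2-large tokenizer on code so token boundaries suit source files.
# Assumptions (not from the commit): dataset id, text field name, streaming batch size.
from datasets import load_dataset
from transformers import AutoTokenizer

# Stream the corpus so it never has to fit in memory.
dataset = load_dataset("codeparrot/codeparrot-clean", split="train", streaming=True)

def batch_iterator(batch_size=1000):
    """Yield lists of raw source-code strings for the tokenizer trainer."""
    batch = []
    for example in dataset:
        batch.append(example["content"])
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

# Start from the GPT2-large tokenizer and learn a same-sized vocabulary from code.
base_tokenizer = AutoTokenizer.from_pretrained("gpt2-large")
code_tokenizer = base_tokenizer.train_new_from_iterator(
    batch_iterator(), vocab_size=len(base_tokenizer)
)
code_tokenizer.save_pretrained("gpt2-large-code-tokenizer")
```

Keeping the original vocabulary size leaves the GPT2-large embedding matrix shape unchanged, which is consistent with the "same architecture and Parameters" claim in the model description.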
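For the fine-tuned model itself, generation would go through the standard text-generation pipeline once the checkpoint and retrained tokenizer are published. The repo id below is a hypothetical placeholder, not one named in this commit.

```python
# Usage sketch for the fine-tuned checkpoint; the repo id is a hypothetical placeholder.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="<your-namespace>/gpt2-large-codeparrot",  # hypothetical repo id
)

prompt = "def fibonacci(n):"
completions = generator(prompt, max_new_tokens=64, do_sample=True, temperature=0.2)
print(completions[0]["generated_text"])
```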