What is the length of the context?
#27 opened by poarpeak
Is it 2K, 32K, or 100K?
"Models are trained on a context length of 8192 tokens" can be found in the research paper or in the config.json file of the model.
osanseviero changed discussion status to closed