max_sequence_length value
#1
by
SatoshiReport
- opened
First, thank you for providing these great models!
max_sequence_length in config.json is set to 2048.
However, on the Model Card page it says Seq Len is 4096.
Should config.json be changed to reflect 4096?
No, the value in config.json is correct. All Llama 1 models have 2048 sequence length.
It's the README that's wrong - apologies. I'll fix that.
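For anyone who wants to confirm this locally, here is a minimal sketch of checking the value recorded in a model's config.json. The file contents below are illustrative stand-ins (written out so the snippet is self-contained), mirroring the 2048-token limit of Llama 1 models mentioned above; in practice you would open the config.json from your downloaded model directory instead.

```python
import json

# Illustrative config contents -- a real Llama 1 config.json has many
# more keys; only the sequence-length field matters here.
sample_config = {"max_sequence_length": 2048}

with open("config.json", "w") as f:
    json.dump(sample_config, f)

# Read the config back and inspect the sequence length.
with open("config.json") as f:
    cfg = json.load(f)

print(cfg["max_sequence_length"])
```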