set model_max_length to the maximum length of model context (131072 tokens) (commit b2c0ece by x0wllaar, committed 4 days ago)
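As a minimal sketch of what this kind of change typically involves, assuming the repository ships a standard transformers tokenizer config, the snippet below sets model_max_length to 131072 and saves the updated tokenizer files. The repository id and output path are placeholders, not details taken from the commit itself.

```python
# Sketch: raise the tokenizer's advertised context window to 131072 tokens.
# Assumes a standard transformers tokenizer; the repo id below is a placeholder.
from transformers import AutoTokenizer

REPO_ID = "your-org/your-model"  # placeholder, not the actual repository
MAX_CONTEXT = 131072             # maximum model context length in tokens

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
tokenizer.model_max_length = MAX_CONTEXT  # persisted to tokenizer_config.json on save
tokenizer.save_pretrained("updated_tokenizer")  # writes the updated tokenizer files locally
```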