The SuperHOT LoRA doesn't take for whatever reason; perplexity gets far worse with it. Maybe it would hold up better if ALL you're sending are constant 3k+ contexts. Use alpha (NTK RoPE scaling) instead.
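For example, a rough sketch of alpha scaling with llama-cpp-python (not the only way to do it; the model path, alpha value, and context size below are just placeholders, and the alpha-to-base conversion is the usual NTK formula):

```python
# Minimal sketch: NTK-aware "alpha" scaling via llama-cpp-python instead of the
# SuperHOT position-compression LoRA. Path, alpha, and n_ctx are placeholders.
from llama_cpp import Llama

alpha = 2.0      # NTK alpha; ~2 is a common pick for ~4k context
head_dim = 128   # LLaMA head dimension
# Convert alpha to a RoPE frequency base: base' = 10000 * alpha^(dim / (dim - 2))
rope_freq_base = 10000.0 * alpha ** (head_dim / (head_dim - 2))

llm = Llama(
    model_path="path/to/model.gguf",  # placeholder path
    n_ctx=4096,                       # extended context window
    rope_freq_base=rope_freq_base,    # apply the NTK alpha scaling here
)

out = llm("Long-context prompt goes here...", max_tokens=64)
print(out["choices"][0]["text"])
```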