Papers on LLM long-context compression. Reading details: https://www.notion.so/LLM-LongContext-Compression-323cc6da39124c3a97d3502e1bf61b7
- Adapting Language Models to Compress Contexts
  Paper • 2305.14788 • Published
- Soaring from 4K to 400K: Extending LLM's Context with Activation Beacon
  Paper • 2401.03462 • Published
- Flexibly Scaling Large Language Models Contexts Through Extensible Tokenization
  Paper • 2401.07793 • Published
- Say More with Less: Understanding Prompt Learning Behaviors through Gist Compression
  Paper • 2402.16058 • Published