
lamm-mit/Llama-3.2-3B-Instruct-Sparse-GIN-orca-math-word-problems
We present an approach to modifying Transformer architectures by integrating graph-aware relational reasoning into the attention mechanism.
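To make the idea concrete, here is a minimal sketch of how graph-aware relational reasoning could be attached to a Transformer attention block: attention weights are reinterpreted as a sparse token graph, and a GIN-style (Graph Isomorphism Network) update is applied to the hidden states through a gated residual adapter. All names (`SparseGINAdapter`, `tau`, `scale`) and design details here are illustrative assumptions, not the model's actual implementation; refer to the repository code and accompanying paper for the authoritative architecture.

```python
import torch
import torch.nn as nn

class SparseGINAdapter(nn.Module):
    """Illustrative adapter (assumed design, not the released implementation):
    treats averaged attention weights as a sparse token graph and applies one
    GIN-style update to the hidden states via a gated residual connection."""

    def __init__(self, hidden_dim: int, tau: float = 0.1, eps: float = 0.0):
        super().__init__()
        self.tau = tau                                   # sparsification threshold (assumed)
        self.eps = nn.Parameter(torch.tensor(eps))       # learnable GIN epsilon
        self.mlp = nn.Sequential(                        # GIN update MLP
            nn.Linear(hidden_dim, hidden_dim),
            nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.scale = nn.Parameter(torch.zeros(1))        # gate: adapter starts as identity

    def forward(self, hidden_states: torch.Tensor, attn_weights: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq, hidden); attn_weights: (batch, heads, seq, seq)
        adj = attn_weights.mean(dim=1)                                   # collapse heads -> (batch, seq, seq)
        adj = torch.where(adj > self.tau, adj, torch.zeros_like(adj))    # keep only strong edges (sparse graph)
        neighbor_sum = torch.bmm(adj, hidden_states)                     # aggregate neighbor features
        updated = self.mlp((1.0 + self.eps) * hidden_states + neighbor_sum)  # GIN update rule
        return hidden_states + self.scale * updated                      # gated residual output


if __name__ == "__main__":
    # Toy shapes standing in for one decoder layer's activations.
    B, H, S, D = 2, 4, 16, 64
    adapter = SparseGINAdapter(hidden_dim=D)
    h = torch.randn(B, S, D)
    attn = torch.softmax(torch.randn(B, H, S, S), dim=-1)
    print(adapter(h, attn).shape)   # torch.Size([2, 16, 64])
```

The gated residual (initialized to zero) is one common way such an adapter can be added to a pretrained model like Llama-3.2-3B-Instruct without disturbing its original behavior at the start of fine-tuning; whether this model uses that particular scheme is an assumption here.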