Large Language Diffusion with Ordered Unmasking (LLaDOU)


We introduce Large Language Diffusion with Ordered Unmasking (LLaDOU), a diffusion language model trained by reinforcing a new reasoning paradigm named the Diffusion Chain of Lateral Thought (DCoLT).

Compared to standard CoT, DCoLT is distinguished by several notable features:

  • Bidirectional Reasoning: Bidirectional self-attention lets the model refine the entire sequence globally throughout generation.
  • Format-Free Reasoning: Intermediate thought steps are not required to be grammatically well-formed.
  • Nonlinear Generation: Tokens are generated at varying positions across different steps (see the decoding sketch after this list).
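
To make the nonlinear generation concrete, below is a minimal, hypothetical sketch of confidence-ordered unmasking for a masked diffusion LM: every position starts masked, and each step commits the positions the model is most confident about, regardless of left-to-right order. The function `ordered_unmask_decode` and the confidence-based selection rule are illustrative assumptions, not the exact LLaDOU unmasking policy (which is learned with reinforcement learning).

```python
import torch

def ordered_unmask_decode(logits_fn, seq_len, mask_id, steps=8):
    """Illustrative confidence-ordered unmasking (not the exact LLaDOU
    policy): all positions start as [MASK]; each step fills the masked
    positions the model is most confident about, anywhere in the sequence."""
    tokens = torch.full((seq_len,), mask_id, dtype=torch.long)
    per_step = max(1, seq_len // steps)
    for _ in range(steps):
        masked = (tokens == mask_id).nonzero(as_tuple=True)[0]
        if masked.numel() == 0:
            break
        logits = logits_fn(tokens).clone()       # (seq_len, vocab_size)
        logits[:, mask_id] = float("-inf")       # never predict [MASK] itself
        conf, pred = logits.softmax(-1).max(-1)  # per-position confidence
        k = min(per_step, masked.numel())
        pick = masked[conf[masked].topk(k).indices]  # most confident slots
        tokens[pick] = pred[pick]                # nonlinear: any position, any step
    return tokens

# Toy stand-in for a bidirectional masked diffusion LM (random logits).
VOCAB, LENGTH, MASK = 100, 16, 0
print(ordered_unmask_decode(lambda t: torch.randn(LENGTH, VOCAB), LENGTH, MASK))
```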

Demonstration of DCoLT

Instructions

LLaDOU-v0-Math is a math-specialized model trained on the GSM8K and MATH datasets.

For inference code and detailed instructions, please refer to our GitHub page: maple-research-lab/LLaDOU.
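
In the meantime, here is a minimal loading sketch. It assumes the checkpoint can be loaded through the standard transformers `from_pretrained` API with `trust_remote_code=True`; whether the repo ships such compatible code is an assumption, and the official inference script on the GitHub page above is the authoritative reference.

```python
# Minimal loading sketch; assumes the repo ships transformers-compatible
# custom modeling code (trust_remote_code). See the GitHub page for the
# official inference pipeline.
import torch
from transformers import AutoModel, AutoTokenizer

repo = "maple-research-lab/LLaDOU-v0-Math"
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModel.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    trust_remote_code=True,
).eval()
```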

Model size: 8.32B parameters (Safetensors, BF16)
