Breaking Language Barriers in Mathematical AI: Introducing Hebrew Math Tutor

Community Article · Published September 7, 2025

Mathematical reasoning shouldn't be limited by language. While state-of-the-art AI models excel at solving complex math problems, most are designed primarily for English speakers, leaving Hebrew-speaking students and educators underserved. Today, we're excited to change that.

Hebrew Math Tutor (Intel/hebrew-math-tutor-v1) brings advanced mathematical problem-solving capabilities directly to Hebrew speakers, providing detailed step-by-step reasoning entirely in Hebrew without sacrificing the computational accuracy that makes these models valuable for education.

The Challenge: Math AI in a Multilingual World

Advanced mathematical AI models like those trained on competition mathematics datasets have shown remarkable problem-solving abilities. However, they primarily operate in English, creating barriers for non-English speaking educational communities. Hebrew speakers, in particular, have faced challenges accessing these powerful educational tools in their native language.

Simply translating outputs isn't enough: effective mathematical tutoring requires natural language flow, culturally appropriate explanations, and seamless integration of Hebrew text with mathematical notation. This calls for a more sophisticated approach.

Our Solution: Purpose-Built Hebrew Mathematical Reasoning

Hebrew Math Tutor addresses these challenges through targeted fine-tuning of Qwen3-4B-Thinking-2507, a powerful 4-billion parameter mathematical reasoning model. Our approach focuses on three key principles:

🎯 Native Hebrew Mathematical Discourse

The model provides complete mathematical explanations in natural Hebrew while preserving mathematical notation and formal expressions. It understands Hebrew mathematical terminology and can explain complex concepts using appropriate pedagogical language.

🔬 Preserved Computational Accuracy

By carefully fine-tuning rather than training from scratch, we maintain the model's core mathematical reasoning capabilities while adapting its communication style to Hebrew.

⚡ Efficient and Accessible

At roughly 4 billion parameters, the model balances capability with computational efficiency, making it practical for educational applications and research prototyping.

Training Approach

Creating an effective Hebrew math model required more than simple translation. Our methodology involved:

Curated Dataset

We selected ~10,000 high-quality problems from the OpenMathReasoning dataset, translating questions and answers to Hebrew while preserving the original reasoning chains and mathematical notation.
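One common way to keep notation intact during translation is to mask math spans with placeholders before running machine translation and restore them afterwards. The sketch below illustrates that general technique; it is not our actual data pipeline, and the placeholder format is an arbitrary choice:

```python
import re

# Matches inline $...$ math, \( ... \) math, and bare integers/fractions (illustrative, not exhaustive)
MATH_PATTERN = re.compile(r"\$[^$]+\$|\\\([^)]*\\\)|\b\d+(?:/\d+)?\b")

def mask_math(text):
    """Replace math spans with numbered placeholders so a translator cannot alter them."""
    spans = []
    def repl(m):
        spans.append(m.group(0))
        return f"<M{len(spans) - 1}>"
    return MATH_PATTERN.sub(repl, text), spans

def unmask_math(text, spans):
    """Restore the original math spans after translation."""
    for i, s in enumerate(spans):
        text = text.replace(f"<M{i}>", s)
    return text

masked, spans = mask_math("Compute $x^2 + 1$ for x = 3/4.")
# `masked` would be sent through an MT system here; placeholders pass through unchanged
restored = unmask_math(masked, spans)
```

Masking also prevents the translator from "translating" digits or LaTeX commands, which is a frequent failure mode when math-heavy text is fed to general-purpose MT systems.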

Supervised Fine-Tuning

We fine-tuned the model over 3 epochs with optimized parameters (learning rate 5e-6, 0.1 warmup, cosine scheduling) to adapt the output language while maintaining the underlying reasoning capabilities.
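For reference, the stated schedule (peak learning rate 5e-6, 10% linear warmup, cosine decay) corresponds to the standard scheduler shape sketched below. This is an illustrative reimplementation of the formula, not our training code:

```python
import math

def lr_at(step, total_steps, peak_lr=5e-6, warmup_frac=0.1):
    """Linear warmup to peak_lr, then cosine decay toward zero."""
    warmup_steps = int(total_steps * warmup_frac)
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# The LR peaks exactly at the end of warmup and decays to ~0 at the final step
```

In practice this is what `get_cosine_schedule_with_warmup` in Transformers produces, so the explicit formula is mainly useful for sanity-checking logged learning rates.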

Preserved Internal Reasoning

The model's internal <think>...</think> reasoning blocks remain in English, as these represent core computational processes that would require more extensive training to modify.
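Because the reasoning block stays in English, applications that only want the Hebrew answer can strip it before display. A minimal sketch, assuming the generated text contains at most one complete think block:

```python
import re

# Remove a <think>...</think> block plus any trailing whitespace
THINK_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def visible_answer(generated: str) -> str:
    """Drop the English chain-of-thought block, keeping the Hebrew answer."""
    return THINK_RE.sub("", generated).strip()

out = "<think>geometric series, ratio 1/2</think>\nהסכום הוא 2"
print(visible_answer(out))  # prints: הסכום הוא 2
```

Note that some serving setups for Qwen3 thinking models may emit only the closing `</think>` tag, so production code may also want to split on a bare closing tag as a fallback.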

Impressive Results Across Multiple Benchmarks

We evaluated Hebrew Math Tutor against its base model on three challenging mathematical benchmarks: MATH500 (curriculum problems), AIME24, and AIME25 (competition mathematics). The results demonstrate significant improvements in Hebrew language output while maintaining strong technical performance.

Hebrew Evaluation Performance

| Dataset | Metric | Base Model | Hebrew Math Tutor | Improvement |
|---------|--------|------------|-------------------|-------------|
| MATH500 | Hebrew Answers | 75% | 100% | +25% |
| MATH500 | pass@16 | 93% | 95% | +2% |
| MATH500 | maj@16 | 88% | 90% | +2% |
| AIME24 | Hebrew Answers | 35.2% | 96.7% | +61.5% |
| AIME24 | pass@16 | 76.7% | 80% | +3.3% |
| AIME24 | maj@16 | 76.7% | 76.7% | No change |
| AIME25 | Hebrew Answers | 36% | 95.2% | +59.2% |
| AIME25 | pass@16 | 80% | 83.3% | +3.3% |
| AIME25 | maj@16 | 70% | 60% | -10% |

Improvements are absolute percentage-point differences.

English Performance (Baseline Preservation)

| Dataset | Metric | Base Model | Hebrew Math Tutor | Change |
|---------|--------|------------|-------------------|--------|
| MATH500 | pass@16 | 99% | 98% | -1% |
| MATH500 | maj@16 | 98% | 98% | No change |
| AIME24 | pass@16 | 93.3% | 90% | -3.3% |
| AIME24 | maj@16 | 86.7% | 86.7% | No change |
| AIME25 | pass@16 | 83.3% | 90% | +6.7% |
| AIME25 | maj@16 | 73% | 80% | +7% |
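For clarity on the metrics: pass@16 counts a problem as solved if any of 16 sampled solutions reaches the correct answer, while maj@16 takes the most frequent final answer across the 16 samples. A per-problem sketch of both:

```python
from collections import Counter

def pass_at_k(sample_answers, gold):
    """1 if any sampled answer matches the gold answer, else 0."""
    return int(any(a == gold for a in sample_answers))

def maj_at_k(sample_answers, gold):
    """1 if the most frequent sampled answer matches the gold answer, else 0."""
    majority, _ = Counter(sample_answers).most_common(1)[0]
    return int(majority == gold)

samples = ["2", "2", "4", "2"]
print(pass_at_k(samples, "4"), maj_at_k(samples, "4"))  # prints: 1 0
```

Benchmark scores are these per-problem values averaged over the dataset, which is why a model can gain on pass@16 (at least one sample is right more often) while losing on maj@16 (the wrong answers cluster).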

Key Performance Insights

🚀 Dramatic Hebrew Language Gains: Hebrew answer production jumped from 35-75% to 95-100% across all benchmarks, a transformative improvement for Hebrew-speaking users.

📈 Consistent Accuracy Improvements: Notable gains in pass@16 scores on Hebrew evaluations, showing the model doesn't just translate but actually improves problem-solving in Hebrew contexts.

🔄 Preserved Core Capabilities: Maintained competitive English performance, demonstrating that Hebrew specialization didn't compromise the model's fundamental mathematical abilities.

⚖️ Nuanced Majority Vote Results: maj@16 improved on MATH500, held steady on AIME24, and dropped on AIME25 (70% to 60%), a result that provides useful signal for future training approaches.

Real-World Impact and Applications

Hebrew Math Tutor opens new possibilities across multiple domains:

Educational Technology

  • Hebrew-first tutoring systems that provide natural, step-by-step mathematical explanations
  • Accessible learning platforms helping Hebrew-speaking students access AI-powered math assistance
  • Curriculum development tools for creating Hebrew mathematical content

Research and Development

  • Multilingual AI research exploring mathematical reasoning across languages
  • Educational assessment studying how language affects mathematical learning
  • Prototype development for Hebrew-language educational applications

Community Impact

  • Reduced language barriers in accessing advanced mathematical AI tools
  • Educational equity by providing Hebrew speakers with capabilities previously available only in English
  • Cultural preservation by enabling mathematical discourse in Hebrew

Getting Started: Simple Integration

Hebrew Math Tutor integrates seamlessly with the Transformers ecosystem:

from transformers import pipeline

model = "Intel/hebrew-math-tutor-v1"
pipe = pipeline("text-generation", model)

messages = [
    {
        "role": "system",
        "content": """You are a helpful AI assistant specialized in mathematics and problem-solving who can answer math questions with the correct answer.
Answer shortly, not more than 500 tokens, but outline the process step by step.
Answer ONLY in Hebrew!""",
    },
    {"role": "user", "content": "מהו סכום הסדרה הבאה:  1 + 1/2 + 1/4 + 1/8 + ..."},  # "What is the sum of the following series: 1 + 1/2 + 1/4 + 1/8 + ..."
]

out = pipe(
    messages,
    return_full_text=False,
    max_new_tokens=1024,
    temperature=0.6,
    top_p=0.95,
    top_k=20,
)
print(out[0]["generated_text"])

Optimization Tips for Best Results

  • 🎯 Be explicit: Request step-by-step reasoning and clearly marked final answers
  • 🔧 Optimal parameters: Use temperature=0.6, top_p=0.95, top_k=20 for balanced creativity and accuracy
  • 📝 Format requests: Include Hebrew cues like "תשובה סופית:" ("final answer:") for consistent output structure
  • 🔍 Verify results: Always validate mathematical solutions, especially in educational contexts


Hebrew Math Tutor in action: A Streamlit interface showing detailed step-by-step reasoning in Hebrew. The expandable reasoning sections allow users to dive deep into the mathematical process or focus on final answers.

Responsible AI: Deployment Considerations

While Hebrew Math Tutor represents significant progress, responsible deployment requires careful consideration:

✅ Recommended Use Cases

  • Educational prototyping and research applications.
  • Hebrew-language tutoring interface development.
  • Mathematical content generation for Hebrew educational materials.
  • Accessibility tools for Hebrew-speaking mathematics learners.

⚠️ Important Limitations to Consider

  • Verification required: Mathematical solutions should always be validated before use in instructional settings.
  • Not for high-stakes assessment: Avoid using for critical evaluations without human oversight.
  • Language mixing potential: May occasionally mix Hebrew and English or use inconsistent formatting.
  • Training bias inheritance: Reflects potential biases present in original training datasets.

🛡️ Ethical Guidelines

The model works best as an educational aid rather than a replacement for qualified instruction. We recommend implementing human oversight, providing clear disclaimers about AI-generated content, and ensuring compliance with relevant privacy regulations in educational applications.

Looking Forward: The Future of Multilingual Mathematical AI

Hebrew Math Tutor demonstrates that language barriers in AI can be effectively addressed through thoughtful fine-tuning approaches. This work represents more than just a Hebrew mathematical model; it's a proof of concept for making advanced AI capabilities truly accessible across linguistic communities.

The techniques developed here can be adapted for other languages, creating a pathway toward more inclusive mathematical AI tools. As we continue to refine these approaches, we're moving closer to a future where language is no longer a barrier to accessing the most advanced educational technologies.

Get Involved

Hebrew Math Tutor is available now under the Apache-2.0 license. We encourage the community to:

  • Try the model and share feedback on its performance in Hebrew mathematical contexts.
  • Build applications that leverage its capabilities for educational and research purposes.
  • Contribute insights about multilingual mathematical AI development.
  • Explore adaptations for other languages and educational domains.

🚀 Start exploring Hebrew Math Tutor today and experience mathematical AI that truly speaks your language.


Citation & Attribution

@misc{hebrew-math-tutor-v1,
  title={Hebrew Math Tutor: A Hebrew-focused Mathematical Reasoning Model},
  author={Intel AI},
  year={2025},
  url={https://huggingface.co/Intel/hebrew-math-tutor-v1},
  note={Fine-tuned from Qwen3-4B-Thinking-2507}
}

Built with gratitude upon the foundational work of Qwen3-4B-Thinking-2507 and the OpenMathReasoning dataset.
