---
language:
- en
- hi
- zh
- es
- fr
- de
- ja
- ko
- ar
- pt
- ru
- it
- nl
- tr
- pl
- sv
- da
- 'no'
- fi
- he
- th
- vi
- id
- ms
- tl
- sw
- yo
- zu
- am
- bn
- gu
- kn
- ml
- mr
- ne
- or
- pa
- ta
- te
- ur
- multilingual
license: apache-2.0
base_model: HelpingAI/Dhanishtha-2.0-preview-0825
tags:
- reasoning
- intermediate-thinking
- transformers
- conversational
- bilingual
- mlx
datasets:
- Abhaykoul/Dhanishtha-R1
- open-thoughts/OpenThoughts-114k
- Abhaykoul/Dhanishtha-2.0-SUPERTHINKER
- Abhaykoul/Dhanishtha-2.0
library_name: mlx
pipeline_tag: text-generation
widget:
- text: >-
    Solve this riddle step by step: I am taken from a mine, and shut up in a
    wooden case, from which I am never released, and yet I am used by almost
    everybody. What am I?
  example_title: Complex Riddle Solving
- text: >-
    Explain the philosophical implications of artificial consciousness and
    think through different perspectives.
  example_title: Philosophical Reasoning
- text: >-
    Help me understand quantum mechanics, but take your time to think through
    the explanation.
  example_title: Educational Explanation
---
# Dhanishtha-2.0-preview-0825-q8-hi-mlx
This model, **Dhanishtha-2.0-preview-0825-q8-hi-mlx**, was converted to MLX format from [HelpingAI/Dhanishtha-2.0-preview-0825](https://huggingface.co/HelpingAI/Dhanishtha-2.0-preview-0825) using mlx-lm version **0.26.1**.
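
The conversion can be reproduced with mlx-lm's Python API. The sketch below is an assumption based on the `convert` helper shipped with mlx-lm, using 8-bit quantization to match the `q8` suffix in the model name; argument names may vary between mlx-lm versions.

```python
# Hedged sketch: assumes mlx_lm.convert accepts these keyword arguments;
# check against your installed mlx-lm (this card was made with 0.26.1).
from mlx_lm import convert

convert(
    "HelpingAI/Dhanishtha-2.0-preview-0825",            # source Hugging Face repo
    mlx_path="Dhanishtha-2.0-preview-0825-q8-hi-mlx",   # output directory
    quantize=True,
    q_bits=8,  # 8-bit weights, matching the "q8" in the model name
)
```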
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

model, tokenizer = load("Dhanishtha-2.0-preview-0825-q8-hi-mlx")

prompt = "hello"

# Apply the model's chat template when one is defined, so the prompt is
# formatted the way the model expects to see conversations.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
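
Dhanishtha's intermediate-thinking traces can run long, so the default token budget may cut reasoning short. Below is a minimal sketch, assuming `generate` accepts `max_tokens` and a `sampler` built with `mlx_lm.sample_utils.make_sampler` (both present in recent mlx-lm releases); verify against your installed version.

```python
from mlx_lm import load, generate
from mlx_lm.sample_utils import make_sampler  # assumption: available in recent mlx-lm

model, tokenizer = load("Dhanishtha-2.0-preview-0825-q8-hi-mlx")

messages = [{"role": "user", "content":
             "Solve this riddle step by step: I am taken from a mine, and "
             "shut up in a wooden case, from which I am never released, and "
             "yet I am used by almost everybody. What am I?"}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# Mildly randomized sampling and a generous token budget so long
# reasoning traces are not truncated; tune both to taste.
sampler = make_sampler(temp=0.7, top_p=0.9)
response = generate(
    model, tokenizer, prompt=prompt,
    max_tokens=2048, sampler=sampler, verbose=True,
)
```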