Varun Gupta (VarunGuptaPy)
2 followers · 4 following
AI & ML interests
None yet
Recent Activity
Reacted to Abhaykoul's post · 3 days ago
Introducing Dhanishtha 2.0: the world's first intermediate thinking model.

Dhanishtha 2.0 is the world's first LLM designed to think between responses. Unlike other reasoning LLMs, which think only once, Dhanishtha can think, rethink, self-evaluate, and refine mid-response using multiple <think> blocks. This technique makes it highly token-efficient: it uses up to 79% fewer tokens than DeepSeek R1.

---

You can try our model at https://helpingai.co/chat
Also, we're going to open-source Dhanishtha on July 1st.

---

For devs: get your API key at https://helpingai.co/dashboard

```
from HelpingAI import HAI  # pip install HelpingAI==1.1.1
from rich import print

hai = HAI(api_key="hl-***********************")

response = hai.chat.completions.create(
    model="Dhanishtha-2.0-preview",
    messages=[{"role": "user", "content": "What is the value of ∫0^∞ x^3/(e^x - 1) dx ?"}],
    stream=True,
    hide_think=False  # hide or show the model's <think> blocks
)

for chunk in response:
    print(chunk.choices[0].delta.content, end="", flush=True)
```
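The streamed output interleaves the model's <think> blocks with the visible answer, so a caller that only wants the final reply has to filter them out. Below is a minimal post-processing sketch, assuming the accumulated stream contains literal <think>...</think> tags as described in the post; the strip_think_blocks helper is hypothetical and not part of the HelpingAI client.

```
import re

def strip_think_blocks(text: str) -> str:
    """Remove <think>...</think> spans from accumulated streamed text.

    Hypothetical helper: assumes the model emits literal <think> tags,
    as described in the Dhanishtha 2.0 announcement above.
    """
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

# Example: collect the streamed chunks first, then show only the final answer.
# chunks = [chunk.choices[0].delta.content or "" for chunk in response]
# print(strip_think_blocks("".join(chunks)))
```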
Reacted to Abhaykoul's post (same post as above) · 3 days ago
Liked a model about 2 months ago · UnfilteredAI/NSFW-gen-v2
Models (3)
VarunGuptaPy/HelpingAI2.5-10B-Q4_K_M-GGUF · Text Generation · 10B · Updated Mar 29 · 12
VarunGuptaPy/Helpingai3-raw-Q4_K_M-GGUF · Text Generation · 10B · Updated Mar 29 · 7
VarunGuptaPy/HelpingAI-3-Q4_K_M-GGUF · Text Generation · 10B · Updated Mar 29 · 3
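All three repos are Q4_K_M GGUF quantizations, intended for GGUF-compatible runtimes such as llama.cpp. Below is a minimal sketch for pulling one of them from the Hub with huggingface_hub; the repo id comes from the list above, while the exact .gguf filename is not shown on this page, so it is discovered at runtime.

```
# pip install huggingface_hub
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "VarunGuptaPy/HelpingAI-3-Q4_K_M-GGUF"

# The GGUF filename is not listed above, so look it up in the repo.
gguf_files = [f for f in list_repo_files(repo_id) if f.endswith(".gguf")]

# Download the first GGUF file to the local cache and print its path.
local_path = hf_hub_download(repo_id=repo_id, filename=gguf_files[0])
print(local_path)
```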
Datasets (0): None public yet