Samuel Azran (SamuelAzran)
AI & ML interests
None yet
Recent Activity
liked a model 9 days ago: Qwen/Qwen2.5-Omni-3B
liked a dataset 27 days ago: HebArabNlpProject/HebrewSentiment
liked a dataset 30 days ago: SLPRL-HUJI/HebDB
Organizations
None yet
SamuelAzran's activity
"thanks , how to fine tune?" (19) · #1 opened 2 months ago by NickyNicky
"New Gemma 2 27B?" (2) · #3 opened 11 months ago by SamuelAzran
"Should not be called mixtral, the models made into the moe are yi based" (18, 9) · #2 opened over 1 year ago by teknium
"How does the MoE work?" (1, 3) · #5 opened over 1 year ago by PacmanIncarnate
"One or two models during inference?" (3) · #3 opened over 1 year ago by Venkman42
"You know Mixtral, Llama 2 70b, GPT3.5... Are All Much Better" (2, 1) · #13 opened over 1 year ago by deleted
"Awesome- Could you help with pointers on doing same for Other languages(Swedish)?" (3) · #2 opened over 1 year ago by Olofp
"NEW! OpenLLMLeaderboard 2023 fall update" (17, 20) · #356 opened over 1 year ago by clefourrier
"Can you release a chat version soon ?" (11) · #8 opened over 1 year ago by dong0213
"Thank you very much!" (6, 10) · #2 opened about 2 years ago by AiCreatornator
"Error running the example code" (1, 21) · #6 opened about 2 years ago by will33am