WebLLM Phi 3.5 Chat
This Space lets you chat with Phi 3.5 models directly in your browser, powered by WebLLM. The model runs fully locally; nothing is sent to a server.
Step 1: Configure and Download Model
Quantization: q4f16 or q4f32
Context Window: 1K, 2K, 4K, 8K, 16K, 32K, 64K, or 128K
Temperature: 1.00
Top P: 1.00
Presence Penalty: 0.00
Frequency Penalty: 0.00
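Under the hood, these settings map onto WebLLM's OpenAI-style chat API: quantization and context window select a prebuilt model variant, while the sliders become sampling parameters. A minimal sketch, assuming the `@mlc-ai/web-llm` package and a Phi 3.5 model id from WebLLM's prebuilt list (the exact id string is an assumption):

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Quantization (q4f16 here) is encoded in the model id; the exact
  // id is assumed from WebLLM's prebuilt model registry.
  const engine = await CreateMLCEngine("Phi-3.5-mini-instruct-q4f16_1-MLC", {
    // Reports download/compile progress while the model loads.
    initProgressCallback: (report) => console.log(report.text),
  });

  // The four sliders above correspond to these sampling parameters.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
    temperature: 1.0,
    top_p: 1.0,
    presence_penalty: 0.0,
    frequency_penalty: 0.0,
  });

  console.log(reply.choices[0].message.content);
}

main();
```

Note that WebLLM requires a WebGPU-capable browser, so this snippet runs in a browser context rather than plain Node.js.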
Step 2: Chat