Core ML Projects


AI & ML interests

Take the Hub to iOS and macOS

Recent Activity


Xenova posted an update 1 day ago
NEW: Real-time conversational AI models can now run 100% locally in your browser! 🤯

🔒 Privacy by design (no data leaves your device)
💰 Completely free... forever
📦 Zero installation required, just visit a website
⚡️ Blazingly-fast WebGPU-accelerated inference

Try it out: webml-community/conversational-webgpu

For those interested, here's how it works:
- Silero VAD for voice activity detection
- Whisper for speech recognition
- SmolLM2-1.7B for text generation
- Kokoro for text to speech

Powered by Transformers.js and ONNX Runtime Web! 🤗 I hope you like it!
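The four-stage pipeline above can be sketched as a single async turn loop. This is a hypothetical illustration of the control flow only: the stage functions stand in for the real models (Silero VAD, Whisper, SmolLM2-1.7B, Kokoro), which in the demo run via Transformers.js and ONNX Runtime Web; none of the names below come from the demo's actual source.

```javascript
// Hypothetical sketch of one conversational turn. The four stages are
// injected as plain async functions so the orchestration is clear and
// testable without the browser-only models.
async function runTurn(audioChunk, { detectSpeech, transcribe, generate, synthesize }) {
  // 1. Voice activity detection: skip silent chunks entirely.
  if (!(await detectSpeech(audioChunk))) return null;
  // 2. Speech recognition: audio -> user text.
  const userText = await transcribe(audioChunk);
  // 3. Text generation: user text -> assistant reply.
  const reply = await generate(userText);
  // 4. Text to speech: reply -> audio for playback.
  return { reply, audio: await synthesize(reply) };
}

// Stubbed stages, standing in for the real WebGPU-backed models:
const stages = {
  detectSpeech: async (chunk) => chunk.length > 0,
  transcribe: async () => "hello there",
  generate: async (text) => `You said: ${text}`,
  synthesize: async (text) => new Float32Array(text.length),
};

runTurn([0.1, 0.2], stages).then((out) => console.log(out.reply));
// → logs "You said: hello there"
```

In the real demo each stage would be backed by a Transformers.js pipeline; injecting them keeps the loop independent of any one model.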
reach-vb posted an update 17 days ago
hey hey @mradermacher - VB from Hugging Face here, we'd love to onboard you over to our optimised xet backend! 💥

as you know, we're in the process of upgrading our storage backend to xet (which helps us scale and offer blazingly fast upload/download speeds too): https://huggingface.co/blog/xet-on-the-hub. now that we're certain the backend can scale even with big models like Llama 4/Qwen 3, we're moving to the next phase of inviting impactful orgs and users on the hub over. as you're a big part of the open source ML community, we'd love to onboard you next and create some excitement about it in the community too!

in terms of actual steps - it should be as simple as one of the org admins joining hf.co/join/xet - we'll take care of the rest.

p.s. you'd need the latest hf_xet-enabled version of the huggingface_hub lib, but everything else should be the same: https://huggingface.co/docs/hub/storage-backends#using-xet-storage

p.p.s. this is fully backwards compatible so everything will work as it should! 🤗
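The client-side step described above amounts to upgrading the Hub libraries; a minimal setup sketch, assuming a pip-managed environment (see the linked storage-backends docs for the authoritative instructions):

```shell
# Upgrade huggingface_hub and install the hf_xet plugin (assumed package
# names per the Hub's Xet storage docs; versions will vary).
pip install -U huggingface_hub hf_xet

# Confirm both packages are installed:
pip show huggingface_hub hf_xet
```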
Xenova posted an update about 1 month ago
Introducing the ONNX model explorer: Browse, search, and visualize neural networks directly in your browser. 🤯 A great tool for anyone studying Machine Learning! We're also releasing the entire dataset of graphs so you can use them in your own projects! 🤗

Check it out! 👇
Demo: onnx-community/model-explorer
Dataset: onnx-community/model-explorer
Source code: https://github.com/xenova/model-explorer
victor posted an update about 1 month ago
DIA TTS is just amazing - please share your funniest gens (here is mine) 😂
nari-labs/Dia-1.6B
pagezyhf posted an update about 1 month ago
If you haven't had the chance to test the latest open model from Meta, Llama 4 Maverick, go try it on AMD MI300 on Hugging Face!

amd/llama4-maverick-17b-128e-mi-amd
Xenova posted an update about 2 months ago
Reasoning models like o3 and o4-mini are advancing faster than ever, but imagine what will be possible when they can run locally in your browser! 🤯

Well, with 🤗 Transformers.js, you can do just that! Here's Zyphra's new ZR1 model running at over 100 tokens/second on WebGPU! ⚡️

Giving models access to browser APIs (like File System, Screen Capture, and more) could unlock an entirely new class of web experiences that are personalized, interactive, and run locally in a secure, sandboxed environment.

For now, try out the demo! 👇
webml-community/Zyphra-ZR1-WebGPU
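Throughput figures like "over 100 tokens/second" come from counting streamed tokens against wall-clock time. A minimal, framework-agnostic sketch: in the demo this counter would presumably be fed from a per-token streaming callback (Transformers.js exposes token streaming, but the exact wiring here is an assumption, not the demo's code).

```javascript
// Minimal tokens-per-second meter. A streaming callback fires once per
// generated token; each call would invoke record(). The clock is injected
// so the logic is deterministic and testable.
function makeThroughputMeter(now = () => performance.now()) {
  let start = null;
  let tokens = 0;
  return {
    record() {
      if (start === null) start = now(); // timestamp of first token
      tokens += 1;
    },
    tokensPerSecond() {
      if (start === null) return 0;
      const elapsedMs = now() - start;
      return elapsedMs > 0 ? (tokens * 1000) / elapsedMs : 0;
    },
  };
}

// Example with an injected fake clock (milliseconds):
let fakeMs = 0;
const meter = makeThroughputMeter(() => fakeMs);
for (let i = 0; i < 101; i++) { meter.record(); fakeMs += 10; }
console.log(meter.tokensPerSecond()); // → 100
```

Computing `(tokens * 1000) / elapsedMs` rather than multiplying a fractional rate keeps the arithmetic exact for clean inputs like the example above.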