Improve model card: Add library, paper, GitHub links, and MoE tag
#2 opened by nielsr (HF Staff)
This PR improves the model card for SmallThinker-21BA3B-Instruct by:

- Adding `library_name: transformers` to the metadata, which enables the "Use in Transformers" widget on the model page.
- Adding the `moe` tag to the metadata for better discoverability, as this model is a Mixture-of-Experts.
- Including a direct link to the official Hugging Face paper page: SmallThinker: A Family of Efficient Large Language Models Natively Trained for Local Deployment.
- Adding a direct link to the main GitHub repository: https://github.com/SJTU-IPADS/SmallThinker.
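The changes above amount to a small edit of the model card's YAML front matter. A minimal sketch of the relevant fields (only `library_name` and the `moe` tag are confirmed by this PR; any other fields or tags on the actual card are not shown here):

```yaml
---
library_name: transformers  # enables the "Use in Transformers" widget
tags:
  - moe  # Mixture-of-Experts, improves discoverability via the Hub's tag filters
---
```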
These updates make the model more accessible and easier to understand for the community.
yixinsong changed pull request status to merged