Inference Providers documentation
Integrations Overview
Hugging Face Inference Providers works with a growing ecosystem of developer tools, frameworks, and platforms. These integrations let you use state-of-the-art models in your existing workflows and development environments.
If a tool doesn’t have explicit support for Inference Providers, it is often still compatible through Inference Providers’ OpenAI-compatible API. Check your tool’s documentation to see whether it can be configured to use a custom OpenAI-compatible endpoint (base URL).
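As a minimal sketch of what that configuration amounts to: any tool that lets you override the OpenAI base URL can be pointed at the Inference Providers router and authenticated with your Hugging Face token. The snippet below uses only the Python standard library to make the request shape explicit; the token and model id are placeholders.

```python
import json
import urllib.request

# Inference Providers' OpenAI-compatible base URL; any tool that lets you
# override the OpenAI base URL can point here.
BASE_URL = "https://router.huggingface.co/v1"

def build_chat_request(hf_token: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request against the HF router.

    Shown with the standard library to make the wire format explicit; in
    practice you would simply point an OpenAI-compatible client at BASE_URL.
    """
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {hf_token}",  # one HF token covers all providers
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example request (constructed but not sent): swap in your own token
# and a model id from the Hub.
req = build_chat_request(
    "hf_xxx",  # placeholder token
    "meta-llama/Llama-3.1-8B-Instruct",  # example model id
    [{"role": "user", "content": "Hello!"}],
)
```

In practice, most tools only ask for two settings — the base URL and the API key — so configuring them for Inference Providers usually means entering `https://router.huggingface.co/v1` and your HF token.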
Why Use Integrations?
- Keep your existing tools: Use Inference Providers with tools you already know
- Access dozens of providers: Switch between providers without changing your code
- Zero markup pricing: Get the same rates as going direct to providers
- Single API token: One HF token for all providers and models
Overview
The table below lists tools, libraries, and applications that work with Hugging Face Inference Providers. For detailed setup instructions, follow the links in the Resources column.
| Integration | Description | Resources |
|---|---|---|
| CrewAI | Framework for orchestrating AI agent teams | Official docs |
| GitHub Copilot Chat | AI pair programmer in VS Code | HF docs |
| fast-agent | Flexible framework for building MCP/ACP-powered agents, workflows, and evals | Official docs |
| Haystack | Open-source LLM framework for building production applications | Official docs |
| Inspect | AI safety and evaluation framework | Official docs |
| LangChain | LLM application framework | Official docs |
| LiteLLM | Unified interface for 100+ LLMs | Official docs |
| LlamaIndex | Data framework for LLM applications | Official docs |
| MacWhisper | Speech-to-text application for macOS | HF docs |
| OpenCode | AI coding agent built for the terminal | Official docs / HF docs |
| PydanticAI | Framework for building AI agents with Python | Official docs |
| Roo Code | AI-powered code generation and refactoring | Official docs |
| smolagents | Framework for building LLM agents with tool integration | Official docs |
Integrations by Category
API Clients
Client libraries and gateways for simplified LLM access.
- LiteLLM - Unified interface for calling 100+ LLMs with the same format (Official docs)
Applications
End-user applications and interfaces powered by LLMs.
- MacWhisper - Speech-to-text application for macOS (HF docs)
Developer Tools
AI-powered coding assistants and development environments.
- GitHub Copilot Chat - AI pair programmer in VS Code (HF docs)
- OpenCode - AI coding agent built for the terminal (Official docs / HF docs)
- Roo Code - AI-powered code generation and refactoring (Official docs)
Evaluation Frameworks
Tools for assessing and ensuring AI safety and performance.
- Inspect - AI safety and evaluation framework (Official docs)
LLM Frameworks
LLM application frameworks and orchestration platforms.
- CrewAI - Framework for orchestrating AI agent teams (Official docs)
- fast-agent - Flexible framework for building MCP/ACP-powered agents, workflows, and evals (Official docs)
- Haystack - Open-source framework for building production-ready LLM applications (Official docs)
- LangChain - Popular framework for developing LLM applications (Official docs)
- LlamaIndex - Data framework for connecting custom data to LLMs (Official docs)
- PydanticAI - Framework for building AI agents with Python (Official docs)
- smolagents - Framework for building LLM agents with tool integration (Official docs)