Excited to announce *Open Responses* – a self-hosted alternative to OpenAI's new _Responses API_ that you can run locally and use with any LLM model or provider, not just OpenAI. Even better, it's also compatible with their agents-sdk, so everything just works out of the box!
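Here's a rough sketch of what that agents-sdk compatibility looks like once your local server is up – note that the URL, port, API key, and model name below are placeholders, so check the project docs for the actual defaults:

```python
from agents import Agent, Runner, set_default_openai_client, set_tracing_disabled
from openai import AsyncOpenAI

# Point the Agents SDK at a locally running Open Responses server instead of
# api.openai.com. base_url, api_key, and model are placeholders for illustration.
set_default_openai_client(
    AsyncOpenAI(base_url="http://localhost:8080/v1", api_key="not-needed-locally")
)
set_tracing_disabled(True)  # optional: skip tracing export since there's no OpenAI key

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
    model="llama-3.1-8b-instruct",  # any model your local provider serves
)

result = Runner.run_sync(agent, "Say hello from a self-hosted Responses API.")
print(result.final_output)
```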
To try it out, just run `npx -y open-responses init` (or `uvx`) and that's it! :)
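Once it's running, using it from the standard OpenAI client should look roughly like this – again, the endpoint and model name are placeholder assumptions, not the project's documented defaults:

```python
from openai import OpenAI

# Point the official OpenAI client at the local Open Responses server.
# The base_url/port and model name are placeholders for your own setup.
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="not-needed-locally",
)

response = client.responses.create(
    model="llama-3.1-8b-instruct",
    input="Write a haiku about self-hosting.",
)
print(response.output_text)
```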
We’d love feedback from the Hugging Face community on how it integrates with your pipelines (support for Hugging Face models landing soon!). Let’s push open-source AI forward together!