		FastAPI POST Endpoint Not Working with Gradio MCP Server on Hugging Face Spaces
Hi all,
I am trying to add a FastAPI POST endpoint to my Gradio MCP server app (running on Hugging Face Spaces) so that my agent can send POST requests to /gradio_api/mcp/messages/. The goal is to allow my agent to communicate with the MCP server using POST, as required by the agent’s protocol.
I followed the recommended approach of attaching a FastAPI POST handler after launching the Gradio app, like this:
if name == "main":
    app = demo.launch(mcp_server=True, ...)
    try:
        from fastapi import Request
        from fastapi.responses import JSONResponse
        if hasattr(app, "app") and app.app is not None:
            fastapi_app = app.app
            @fastapi_app.post("/gradio_api/mcp/messages/")
            async def mcp_messages(request: Request):
                data = await request.json()
                return JSONResponse({"status": "received", "data": data})
    except Exception as e:
        print("Exception while registering FastAPI POST handler:", e)
However, I keep getting errors such as:
Import "fastapi" could not be resolved (even though FastAPI is in requirements.txt)
Or the app fails to initialize on Spaces, with a 500 error or an “application does not seem to be initialized” message.
Why I need this:
My agent connects to the MCP server via SSE, but also needs to POST messages to /gradio_api/mcp/messages/. Without this POST endpoint, the agent cannot communicate fully with the server.
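For context, this is roughly the shape of the call the agent needs to make. The payload below is only a placeholder (not the real MCP message format), the Space URL is made up, and httpx is just the HTTP client I happen to use:

import httpx

resp = httpx.post(
    "https://my-space.hf.space/gradio_api/mcp/messages/",  # placeholder Space URL
    json={"type": "message", "content": "hello from the agent"},  # placeholder payload
)
print(resp.status_code, resp.text)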
Questions:
Is this the correct way to add a FastAPI POST endpoint to a Gradio app on Hugging Face Spaces?
Is there a better or more reliable way to expose custom POST endpoints alongside Gradio’s MCP server? (A rough sketch of the alternative I have in mind is below.)
Are there any known issues with FastAPI support or import resolution on Hugging Face Spaces?
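Regarding question 2, the alternative I've been considering (sketched below, untested on Spaces) is to build the FastAPI app myself, register the POST route on it, and then mount the Gradio Blocks with gr.mount_gradio_app. I'm not sure whether Gradio's MCP endpoints still get set up when the app is mounted this way instead of launched with mcp_server=True, which is part of what I'm asking:

import gradio as gr
import uvicorn
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

api = FastAPI()

# Register the custom POST route before mounting Gradio so it takes precedence
@api.post("/gradio_api/mcp/messages/")
async def mcp_messages(request: Request):
    data = await request.json()
    return JSONResponse({"status": "received", "data": data})

with gr.Blocks() as demo:
    gr.Markdown("MCP demo")  # placeholder for my actual UI

# Mount the Gradio app onto the existing FastAPI app at the root path
api = gr.mount_gradio_app(api, demo, path="/")

if __name__ == "__main__":
    # 7860 is the default port Spaces expects
    uvicorn.run(api, host="0.0.0.0", port=7860)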
