Sidra Bibi (FD900)
0 followers · 12 following
AI & ML interests
None yet
Recent Activity
updated a Space FD900/Final_Assignment_Template about 1 hour ago
replied to burtenshaw's post about 18 hours ago
You don't need remote APIs for a coding copilot, or for the MCP Course! Set up a fully local IDE with MCP integration using Continue. This is what you need to do to take control of your copilot:

1. Get the Continue extension from the [VS Code marketplace](https://marketplace.visualstudio.com/items?itemName=Continue.continue) to serve as the AI coding assistant.

2. Serve the model with an OpenAI-compatible server in Llama.cpp, LM Studio, etc.:

```
llama-server -hf unsloth/Devstral-Small-2505-GGUF:Q4_K_M
```

3. Create a `.continue/models/llama-max.yaml` file in your project to tell Continue how to use the local llama.cpp model:

```
name: Llama.cpp model
version: 0.0.1
schema: v1
models:
  - provider: llama.cpp
    model: unsloth/Devstral-Small-2505-GGUF
    apiBase: http://localhost:8080
    defaultCompletionOptions:
      contextLength: 8192 # Adjust based on the model
    name: Llama.cpp Devstral-Small
    roles:
      - chat
      - edit
```

4. Create a `.continue/mcpServers/playwright-mcp.yaml` file to integrate a tool, like the Playwright browser automation tool, with your assistant:

```
name: Playwright mcpServer
version: 0.0.1
schema: v1
mcpServers:
  - name: Browser search
    command: npx
    args:
      - "@playwright/mcp@latest"
```

Check out the full tutorial in [the MCP course](https://huggingface.co/learn/mcp-course/unit2/continue-client)
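The Playwright entry generalizes to any MCP server distributed as an npm package: each tool gets its own YAML file under `.continue/mcpServers/`. As a hedged sketch, here is what a second tool could look like, using the `@modelcontextprotocol/server-filesystem` package (that package name and the directory argument are assumptions for illustration, not part of the tutorial above):

```
name: Filesystem mcpServer
version: 0.0.1
schema: v1
mcpServers:
  - name: Filesystem
    command: npx
    args:
      - "@modelcontextprotocol/server-filesystem"
      - "/path/to/your/project"  # directory the tool is allowed to read
```

Following the tutorial's convention, this would live in its own file, e.g. `.continue/mcpServers/filesystem-mcp.yaml`, alongside the Playwright config.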
reacted to burtenshaw's post about 18 hours ago
View all activity
Organizations
spaces
1
Sleeping
Template Final Assignment
models
0
None public yet
datasets
0
None public yet