writinwaters committed on
Commit
dc34855
·
1 Parent(s): 5281167

Fixed a broken link (#2190)


To fix a broken link

### Type of change

- [x] Documentation Update

Files changed (1)
  1. docs/references/faq.md +2 -2
docs/references/faq.md
```diff
@@ -357,7 +357,7 @@ This exception occurs when starting up the RAGFlow server. Try the following:
 
 1. Right click the desired dialog to display the **Chat Configuration** window.
 2. Switch to the **Model Setting** tab and adjust the **Max Tokens** slider to get the desired length.
-3. Click **OK** to confirm your change.
+3. Click **OK** to confirm your change.
 
 
 ### 2. What does Empty response mean? How to set it?
@@ -370,7 +370,7 @@ You limit what the system responds to what you specify in **Empty response** if
 
 ### 4. How to run RAGFlow with a locally deployed LLM?
 
-You can use Ollama to deploy local LLM. See [here](https://github.com/infiniflow/ragflow/blob/main/docs/guides/deploy_local_llm.md) for more information.
+You can use Ollama to deploy local LLM. See [here](../guides/deploy_local_llm.mdx) for more information.
 
 ### 5. How to link up ragflow and ollama servers?
```
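The fix replaces an absolute GitHub URL with a relative path, which the docs site resolves against the directory of the document containing the link. A minimal sketch of that resolution (the `resolve_relative_link` helper is hypothetical, not part of the RAGFlow repo):

```python
import posixpath

def resolve_relative_link(doc_path: str, link: str) -> str:
    """Resolve a Markdown relative link against the path of the doc that contains it."""
    base = posixpath.dirname(doc_path)          # directory of the linking document
    return posixpath.normpath(posixpath.join(base, link))

# The corrected link in docs/references/faq.md now points inside the docs tree:
print(resolve_relative_link("docs/references/faq.md", "../guides/deploy_local_llm.mdx"))
# → docs/guides/deploy_local_llm.mdx
```

Unlike the old absolute URL (which hard-coded the `main` branch path and the `.md` extension), the relative link keeps working when the docs tree is moved, branched, or built into a site.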