writinwaters committed · commit 3366eac · 1 parent: 533089d

Differentiated API key names to avoid confusion. (#3107)

### What problem does this PR solve?

### Type of change

- [x] Documentation Update

docs/guides/llm_api_key_setup.md CHANGED (+11 −11)
**Before:**

@@ -3,13 +3,13 @@ sidebar_position: 5
 slug: /llm_api_key_setup
 ---
 
-# Configure
 
-An API key is required for RAGFlow to interact with an online AI model. This guide provides information about setting your API key in RAGFlow.
 
-## Get
 
-For now, RAGFlow supports the following online LLMs. Click the corresponding link to apply for your API key. Most LLM providers grant newly-created accounts trial credit, which will expire in a couple of months, or a promotional amount of free quota.
 
 - [OpenAI](https://platform.openai.com/login?launch)
 - [Azure-OpenAI](https://ai.azure.com/)
@@ -32,14 +32,14 @@ For now, RAGFlow supports the following online LLMs. Click the corresponding link to apply for your API key.
 If you find your online LLM is not on the list, don't feel disheartened. The list is expanding, and you can [file a feature request](https://github.com/infiniflow/ragflow/issues/new?assignees=&labels=feature+request&projects=&template=feature_request.yml&title=%5BFeature+Request%5D%3A+) with us! Alternatively, if you have customized or locally-deployed models, you can [bind them to RAGFlow using Ollama, Xinference, or LocalAI](./deploy_local_llm.mdx).
 :::
 
-## Configure
 
-You have two options for configuring your API key:
 
 - Configure it in **service_conf.yaml** before starting RAGFlow.
 - Configure it on the **Model Providers** page after logging into RAGFlow.
 
-### Configure API key before starting up RAGFlow
 
 1. Navigate to **./docker/ragflow**.
 2. Find entry **user_default_llm**:
@@ -51,10 +51,10 @@ You have two options for configuring your API key:
 
 *After logging into RAGFlow, you will find your chosen model appears under **Added models** on the **Model Providers** page.*
 
-### Configure API key after logging into RAGFlow
 
 :::caution WARNING
-After logging into RAGFlow, configuring API key through the **service_conf.yaml** file will no longer take effect.
 :::
 
 After logging into RAGFlow, you can *only* configure API Key on the **Model Providers** page:
@@ -62,11 +62,11 @@ After logging into RAGFlow, you can *only* configure API Key on the **Model Providers** page:
 1. Click on your logo on the top right of the page **>** **Model Providers**.
 2. Find your model card under **Models to be added** and click **Add the model**:
 
-3. Paste your API key.
 4. Fill in your base URL if you use a proxy to connect to the remote service.
 5. Click **OK** to confirm your changes.
 
 :::note
-
 
 :::
**After:**

@@ -3,13 +3,13 @@ sidebar_position: 5
 slug: /llm_api_key_setup
 ---
 
+# Configure model API key
 
+An API key is required for RAGFlow to interact with an online AI model. This guide provides information about setting your model API key in RAGFlow.
 
+## Get model API key
 
+For now, RAGFlow supports the following online LLMs. Click the corresponding link to apply for your model API key. Most LLM providers grant newly-created accounts trial credit, which will expire in a couple of months, or a promotional amount of free quota.
 
 - [OpenAI](https://platform.openai.com/login?launch)
 - [Azure-OpenAI](https://ai.azure.com/)
@@ -32,14 +32,14 @@ For now, RAGFlow supports the following online LLMs. Click the corresponding link to apply for your model API key.
 If you find your online LLM is not on the list, don't feel disheartened. The list is expanding, and you can [file a feature request](https://github.com/infiniflow/ragflow/issues/new?assignees=&labels=feature+request&projects=&template=feature_request.yml&title=%5BFeature+Request%5D%3A+) with us! Alternatively, if you have customized or locally-deployed models, you can [bind them to RAGFlow using Ollama, Xinference, or LocalAI](./deploy_local_llm.mdx).
 :::
 
+## Configure model API key
 
+You have two options for configuring your model API key:
 
 - Configure it in **service_conf.yaml** before starting RAGFlow.
 - Configure it on the **Model Providers** page after logging into RAGFlow.
 
+### Configure model API key before starting up RAGFlow
 
 1. Navigate to **./docker/ragflow**.
 2. Find entry **user_default_llm**:
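The **user_default_llm** snippet itself (doc lines 46–50) is elided from this diff hunk. Purely as a hypothetical illustration of the shape such an entry takes — the key names and values below are placeholders chosen for this sketch, not content recovered from the PR:

```yaml
# Illustrative sketch only: names and values are placeholders,
# not the exact contents elided from this diff.
user_default_llm:
  factory: "OpenAI"   # the model provider you chose
  api_key: "sk-..."   # your model API key from that provider
  base_url: ""        # optional; set when routing through a proxy
```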
@@ -51,10 +51,10 @@ You have two options for configuring your model API key:
 
 *After logging into RAGFlow, you will find your chosen model appears under **Added models** on the **Model Providers** page.*
 
+### Configure model API key after logging into RAGFlow
 
 :::caution WARNING
+After logging into RAGFlow, configuring your model API key through the **service_conf.yaml** file will no longer take effect.
 :::
 
 After logging into RAGFlow, you can *only* configure API Key on the **Model Providers** page:
@@ -62,11 +62,11 @@ After logging into RAGFlow, you can *only* configure API Key on the **Model Providers** page:
 1. Click on your logo on the top right of the page **>** **Model Providers**.
 2. Find your model card under **Models to be added** and click **Add the model**:
 
+3. Paste your model API key.
 4. Fill in your base URL if you use a proxy to connect to the remote service.
 5. Click **OK** to confirm your changes.
 
 :::note
+To update an existing model API key at a later point:
 
 :::
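Both configuration paths in the doc hinge on the **user_default_llm** entry being complete before RAGFlow starts. A minimal pre-flight sketch of such a check — assuming PyYAML and the `factory`/`api_key` key names described above; this helper is an illustration, not part of RAGFlow:

```python
# Hypothetical pre-flight check for the user_default_llm entry in
# service_conf.yaml. Key names follow the pattern described in the
# guide and are assumptions, not an authoritative schema.
import yaml  # PyYAML


def check_default_llm(conf_text: str) -> list[str]:
    """Return a list of problems found in the user_default_llm entry."""
    conf = yaml.safe_load(conf_text) or {}
    entry = conf.get("user_default_llm") or {}
    problems = []
    if not entry:
        problems.append("user_default_llm entry is missing")
    if not entry.get("factory"):
        problems.append("factory (model provider name) is empty")
    if not entry.get("api_key"):
        problems.append("api_key is empty")
    return problems


sample = """
user_default_llm:
  factory: 'OpenAI'   # provider name as shown on the Model Providers page
  api_key: 'sk-...'   # your model API key
"""
print(check_default_llm(sample))  # prints [] when the entry is complete
```

Running a check like this before `docker compose up` surfaces an empty or missing key early, instead of discovering it through failed model calls after login.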