Update chain.py
chain.py  CHANGED
@@ -95,32 +95,9 @@ def get_new_chain1(vectorstore, rephraser_llm, final_output_llm, isFlan) -> Chai
 input_variables=["page_content", "source"],
 )
 
-gpt_template = """You are an AI assistant for the open source transformers library provided by Hugging Face. The documentation is located at https://huggingface.co/docs/transformers.
-- You are given extracted parts of a long document and a question.
-- Provide a conversational answer with a hyperlink to the documentation based on the "source".
-- Do NOT add .html to the end of links. Make sure to bold link text.
-- You should only use hyperlinks that are explicitly listed as a source in the context. Do NOT make up a hyperlink that is not listed.
-- If the question includes a request for code, provide a code block directly from the documentation.
-- If you don't know the answer, just say "Hmm, I'm not sure." Don't try to make up an answer.
-- If the question is not about Hugging Face Transformers, politely inform them that you are tuned to only answer questions about Transformers.
-
-For example, if someone asks how to install Transformers, you should say:
-
-You can install with pip:
-'''py
-pip install transformers
-'''
-**(Source)**[https://huggingface.co/docs/transformers/main/en/installation]
-
-Question: {question}
-=========
-{context}
-=========
-Answer in Markdown:"""
-
 flan_template = """
 {context}
-Based on the above documentation, answer the user's question in markdown: {question}"""
+Based on the above documentation, answer the user's question in markdown. If you can't answer, say "For this topic, I reccomend viewing the docs": {question}"""
 
 PROMPT = PromptTemplate(template=gpt_template, input_variables=["question", "context"])
 
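For context, the surviving flan_template is consumed through LangChain's PromptTemplate with two input variables, question and context. Below is a minimal sketch of how such a template is built and rendered; the FLAN_PROMPT name and the example strings are illustrative assumptions, not code from this Space, while the template text is copied from the new chain.py.

```python
# Illustrative sketch only: build a LangChain PromptTemplate from the flan_template
# and render it into the final prompt string that would be sent to the model.
# FLAN_PROMPT and the example context/question are assumptions, not from the Space.
from langchain.prompts import PromptTemplate

flan_template = """
{context}
Based on the above documentation, answer the user's question in markdown. If you can't answer, say "For this topic, I reccomend viewing the docs": {question}"""

FLAN_PROMPT = PromptTemplate(
    template=flan_template,
    input_variables=["question", "context"],
)

# format() substitutes both variables and returns the completed prompt string.
rendered = FLAN_PROMPT.format(
    context="Transformers can be installed from PyPI with `pip install transformers`.",
    question="How do I install Transformers?",
)
print(rendered)
```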