Metal3d committed on
Commit 547adee · unverified · 1 Parent(s): 55737cf

Add article link and fix history prompt

Add a link to the related article, and apply the forgotten `format` call on the history prompt so the user question is injected.

Files changed (1)
  1. app.py +3 -1
app.py CHANGED
@@ -121,7 +121,7 @@ def bot(history: list, max_num_tokens: int, final_num_tokens: int):
     t.start()
 
     # rebuild the history with the new content
-    history[-1].content += prepend
+    history[-1].content += prepend.format(question=question)
     if ANSWER_MARKER in prepend:
         # stop thinking, this is the answer now (no metadata for intermediate steps)
         history.append(gr.ChatMessage(role="assistant", content=""))
@@ -143,6 +143,8 @@ with gr.Blocks(fill_height=True, title="Making any model reasoning") as demo:
     This is a simple proof-of-concept to get any LLM model to reason ahead of its response.
     This interface uses *{model_name}* model which is **not** a reasoning model. The used method
     is only to force some "reasoning" steps with prefixes to help the model to enhance the answer.
+
+    See related article here: [Make any model reasoning](https://huggingface.co/blog/Metal3d/making-any-model-reasoning)
     """)
     chatbot = gr.Chatbot(
         scale=1,
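A minimal sketch of what the first hunk fixes, assuming `prepend` is a prompt-step template string containing a `{question}` placeholder (the names `prepend` and `question` come from the diff; the template text here is hypothetical, since the surrounding app code is not shown):

```python
# Hypothetical reasoning-step template, as assumed from the diff.
prepend = "Let me restate the problem: {question}\n"
question = "Why is the sky blue?"

# Before the fix: the raw template, with the literal "{question}",
# was appended to the chat history.
before = prepend

# After the fix: the user's question is injected into the template
# before it is appended.
after = prepend.format(question=question)

print(before)  # -> "Let me restate the problem: {question}"
print(after)   # -> "Let me restate the problem: Why is the sky blue?"
```

Without the `format` call, the model would see the unfilled `{question}` placeholder in its own history instead of the actual user question, which is why the commit message calls it "forgotten".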