Original article: enhancing-llms-reasoning-with-step-back-prompting
Paper: https://arxiv.org/pdf/2310.06117.pdf
November 6, 2023
Introduction
In the fast-moving field of large language models, a persistent challenge is their ability to handle complex tasks that demand a deep understanding of subtle details and context.
Step-Back Prompting has emerged as an innovative technique to address this: it recognizes that many tasks are packed with intricate details and constraints.
Those details can make it difficult for large language models to retrieve and apply the relevant information effectively.
Step Back Prompting
Step-Back Prompting is a technique for strengthening the reasoning and problem-solving abilities of language models, LLMs in particular. It encourages the LLM to take a step back from the question at hand and pose a more abstract, higher-level question that captures the essence of the original query.
The idea behind Step-Back Prompting is that many complex questions and tasks carry so many intricate details and constraints that LLMs struggle to retrieve and apply the relevant information directly.
By introducing a step-back question, which is usually easier to answer and revolves around a broader concept or principle, the LLM can structure its reasoning more effectively.
Process of Step-Back Prompting
The typical Step-Back Prompting process consists of two main steps (a minimal code sketch follows the list):
- Abstraction: instead of trying to answer the original question right away, the LLM first poses a more general question about the bigger idea or underlying principle. This helps it reason about, and look up, the relevant facts.
- Reasoning: once it has an answer to the general question, the LLM uses that information to reason about and answer the original question. This is called "abstraction-grounded reasoning": knowledge about the bigger idea is used to produce a good answer to the original, harder question.
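A minimal sketch of the two-step flow, assuming the openai Python package (v1 or later) and an OPENAI_API_KEY in the environment; the ask helper and the prompt wording are ours, not taken from the paper:

from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    """Send a single-turn prompt to the chat model and return its text reply."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return response.choices[0].message.content

original_question = "Can a Tesla car drive itself?"

# Step 1: Abstraction - derive a broader step-back question.
step_back_question = ask(
    "Paraphrase this question into a more generic, easier-to-answer question:\n"
    + original_question
)

# Step 2: Reasoning - answer the broader question, then ground the final answer in it.
background = ask(step_back_question)
final_answer = ask(
    f"Background knowledge:\n{background}\n\n"
    f"Using the background above, answer the original question: {original_question}"
)
print(final_answer)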
Implementation with LangChain
First, we provide a few-shot set of examples that demonstrate how step-back questions are formed. We then transform these examples into chat messages.
# LangChain imports (2023-era, pre-0.1 package layout)
from langchain.prompts import ChatPromptTemplate, FewShotChatMessagePromptTemplate

# Few Shot Examples
examples = [
    {
        "input": "What is the birthplace of Albert Einstein?",
        "output": "what is Albert Einstein's personal history?",
    },
    {
        "input": "Can a Tesla car drive itself?",
        "output": "what can a Tesla car do?",
    },
    {
        "input": "Did Queen Elizabeth II ever visit Canada?",
        "output": "what is Queen Elizabeth II's travel history?",
    },
    {
        "input": "Can a SpaceX rocket land itself?",
        "output": "what can a SpaceX rocket do?",
    },
]

# We now transform these to example messages
example_prompt = ChatPromptTemplate.from_messages(
    [
        ("human", "{input}"),
        ("ai", "{output}"),
    ]
)
few_shot_prompt = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    examples=examples,
)
For reference, here is the style of step-back prompt used in the paper: the question is first paraphrased into a broader step-back question, relevant principles are retrieved, and the final answer is grounded in those principles.
You are an expert at world knowledge.
Your task is to step back and paraphrase a question to a more generic
step-back question, which is easier to answer.
Here are a few examples:
Original Question: Which position did Knox Cunningham hold from May 1955 to Apr 1956?
Stepback Question: Which positions have Knox Cunningham held in his career?
Original Question: Who was the spouse of Anna Karina from 1968 to 1974?
Stepback Question: Who were the spouses of Anna Karina?
Original Question: Which team did Thierry Audel play for from 2007 to 2008?
Stepback Question: Which teams did Thierry Audel play for in his career?
Original Question: "Potassium-40 is a minor isotope found in naturally occurring potassium. It is radioactive and can be detected on simple radiation counters.
How many protons, neutrons, and electrons does potassium-40 have when it is part of K2SO4?
Choose an option from the list below:
0) 21 neutrons, 19 protons, 18 electrons
1) 20 neutrons, 19 protons, 19 electrons
2) 21 neutrons, 19 protons, 19 electrons
3) 19 neutrons, 19 protons, 19 electrons"
Stepback Question: "What are the chemistry principles behind this question?"
Principles:
"Atomic number: The atomic number of an element is the number of protons in the nucleus of an atom of that element."
Final Answer:
The system message sets the context and the task for the model. The few-shot examples (bundled in few_shot_prompt) give the model additional context and demonstrations of how to perform the task. The user message is where the user's concrete question goes; the model then tries to rephrase it as a more generic step-back question.
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            """You are an expert at world knowledge. Your task is to step back and paraphrase a question to a more generic step-back question, which is much easier to answer. Here are a few examples:""",
        ),
        # Few shot examples
        few_shot_prompt,
        # New question
        ("user", "{question}"),
    ]
)
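Before wiring the prompt into a pipeline, it can help to render it once and confirm that the system message, the few-shot examples, and the user turn all appear in order. A small optional check (the sample question here is just an illustration):

# Preview the assembled prompt for an example question.
for message in prompt.format_messages(question="Can a Tesla car drive itself?"):
    print(f"{message.type}: {message.content}")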
Next, we define a pipeline that generates the step-back question. It starts from the prompt defined above, passes the conversation to an OpenAI GPT chat model with a fixed level of randomness (temperature=0), and then uses StrOutputParser to extract the text content of the model's response. The end result is a step-back question generated from the user's input.
from langchain.chat_models import ChatOpenAI
from langchain.schema.output_parser import StrOutputParser

question_gen = prompt | ChatOpenAI(temperature=0) | StrOutputParser()
Ask the question.
question = "What is the name of the rover that NASA landed on Mars in 2021?"
Now we invoke question_gen, passing it a dictionary that contains the question. The question flows through the pipeline defined above, and a step-back question is generated as the result.
question_gen.invoke({"question": question})
This is the answer generated by that call.
what rovers has NASA sent to Mars?
Next, we set up a simple DuckDuckGo search wrapper with a maximum number of results, so that we can retrieve search results simply by calling a retriever function with a query.
# Requires the duckduckgo-search package (pip install duckduckgo-search).
from langchain.utilities import DuckDuckGoSearchAPIWrapper

search = DuckDuckGoSearchAPIWrapper(max_results=4)


def retriever(query):
    """Run a DuckDuckGo search and return the combined result snippets."""
    return search.run(query)
We call retriever with the original question.
retriever(question)
Retrieving on the original question returns the following results.
Perseverance landed on Mars in February 2021.
As of early February of this year, the rover had gathered 18 samples — and deposited half for a future potential return to Earth.
JPL-CALTECH/NASA The Curiosity rover team has been preparing for the start of the Solar Conjunction in November, when contact with all Mars spacecraft will be impossible for three weeks since Mars will be behind the Sun as seen from Earth.
NASA's Perseverance rover is busy just exploring Mars, looking for signs of ancient life.
Perseverance, nicknamed "Percy", the centerpiece of NASA's $2.7 billion Mars 2020 mission, touched down ... Since arriving at Jezero Crater in 2021, the six wheeled, nuclear-powered rover has been examining geologic features and collecting samples of the Red Planet that are central to the first step of the NASA-ESA (European Space Agency) Mars Sample Return campaign.
Now we generate the step-back question from question and use it as the search query, retrieving results from DuckDuckGo through the retriever function.
retriever(question_gen.invoke({"question": question}))
This returns the results below.
From Wikipedia, the free encyclopedia This is a list of the 50 spacecraft missions (including unsuccessful ones) relating to the planet , such as orbiters and rovers.
Mission to Mars Gravity assist, destination elsewhere [1] [2] [3] ’1M No.1‘,’1M No.2‘,’1M No.2‘,’OKB-1 2MV-4 No.1‘,’2MV-4 No.1‘ Booster stage ("Block L") disintegrated in (2MV-4 No.2) NASA has sent a host of remotely-operated landers, orbiters and rovers to study Mars and bring back geologic samples.
While no humans have set foot on the planet, that could change.
NASA has ... The Subsurface Water Ice Mapping (SWIM), a NASA-funded project, has released its fourth and the more recent map of the prospective locations of subsurface water ice on Mars.
This, as per NASA officials, will help mission planners decide where exactly to send the first humans to Mars.
The blue areas on this map of Mars are regions where NASA ... Thanks in part to NASA's Curiosity Mars rover, geologists have discovered evidence for ancient rivers on the Red Planet, suggesting the right conditions for life.
From the LangChain Hub we pull the langchain-ai/stepback-answer prompt, which is used to compose the final answer.
# Requires the langchainhub package (pip install langchainhub).
from langchain import hub

response_prompt = hub.pull("langchain-ai/stepback-answer")
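The chain defined next supplies three inputs to this prompt: normal_context, step_back_context, and question. If you want to confirm what the pulled prompt expects, you can inspect its input variables (the exact printed list and its ordering may vary):

# Inspect which input variables the hub prompt expects.
print(response_prompt.input_variables)
# Expected to include 'normal_context', 'step_back_context' and 'question'.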
`chain` is a sequence of transformations: it gathers context for both the original question and the step-back question, feeds that context and the question into the LLM through the pulled response prompt, and finally parses the model's response into plain text.
from langchain.schema.runnable import RunnableLambda

chain = (
    {
        # Retrieve context using the normal question
        "normal_context": RunnableLambda(lambda x: x["question"]) | retriever,
        # Retrieve context using the step-back question
        "step_back_context": question_gen | retriever,
        # Pass on the question
        "question": lambda x: x["question"],
    }
    | response_prompt
    | ChatOpenAI(temperature=0)
    | StrOutputParser()
)
Now we invoke the chain with question.
chain.invoke({"question": question})
This yields the most relevant generated answer, shown below.
The name of the rover that NASA landed on Mars in 2021 is Perseverance.
For comparison, we now give the model a baseline response template that relies only on the context retrieved for the original question.
response_prompt_template = """You are an expert of world knowledge. I am going to ask you a question. Your response should be comprehensive and not contradicted with the following context if they are relevant. Otherwise, ignore them if they are not relevant.
{normal_context}
Original Question: {question}
Answer:"""
response_prompt = ChatPromptTemplate.from_template(response_prompt_template)
Again, we build a chain: it retrieves context using only the normal question and then passes it, along with the question, through this baseline response template.
chain = (
    {
        # Retrieve context using the normal question (only the first 3 results)
        "normal_context": RunnableLambda(lambda x: x["question"]) | retriever,
        # Pass on the question
        "question": lambda x: x["question"],
    }
    | response_prompt
    | ChatOpenAI(temperature=0)
    | StrOutputParser()
)
Invoke the chain with question.
chain.invoke({"question": question})
The generated answer is below.
The name of the rover that NASA landed on Mars in 2021 is Perseverance, also known as "Percy".
It is the centerpiece of NASA's Mars 2020 mission, which successfully touched down on February 18, 2021, in Jezero Crater.
Perseverance is a six-wheeled, nuclear-powered rover that is currently exploring Mars and searching for signs of ancient life.
It is part of the NASA-ESA Mars Sample Return campaign and is tasked with collecting samples of the Red Planet for future analysis and potential return to Earth.
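To compare the two approaches directly, keep the step-back chain and the baseline chain bound to separate names; in the walkthrough above both were assigned to chain, so the names step_back_chain and baseline_chain below are ours, not from the original code:

# Illustrative comparison, assuming the two chains were kept under separate names.
step_back_answer = step_back_chain.invoke({"question": question})
baseline_answer = baseline_chain.invoke({"question": question})
print("Step-back:", step_back_answer)
print("Baseline: ", baseline_answer)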
Conclusion
By using Step-Back Prompting, large language models can reduce the chance of errors in intermediate reasoning, handle complex tasks more effectively, and give more accurate, more nuanced answers to difficult questions.
The technique promises to improve the practical usefulness of language models across the many domains and applications that demand deep understanding and sophisticated reasoning.