From 509820f6dc12d12a88f2c3c16f309cf4e42c0678 Mon Sep 17 00:00:00 2001
From: Fan Pu Zeng <zengfanpu@gmail.com>
Date: Tue, 22 Oct 2024 22:07:30 -0400
Subject: [PATCH] add summary

---
 ...ge-intensive-multi-step-questions.markdown | 29 ++++++++++++++++---
 1 file changed, 25 insertions(+), 4 deletions(-)

diff --git a/_summaries/2024-10-03-interleaving-retrieval-with-chain-of-thought-reasoning-for-knowledge-intensive-multi-step-questions.markdown b/_summaries/2024-10-03-interleaving-retrieval-with-chain-of-thought-reasoning-for-knowledge-intensive-multi-step-questions.markdown
index d50a1caf6..bd897b8b5 100644
--- a/_summaries/2024-10-03-interleaving-retrieval-with-chain-of-thought-reasoning-for-knowledge-intensive-multi-step-questions.markdown
+++ b/_summaries/2024-10-03-interleaving-retrieval-with-chain-of-thought-reasoning-for-knowledge-intensive-multi-step-questions.markdown
@@ -9,9 +9,30 @@ bib_id: 2212.10509v2
 
 #### 1. Interleaved Retrieval guided by Chain-of-Thought
 
-Overall idea seems pretty similar to the 
-[Enhancing Retrieval-Augmented Large Language Models with Iterative Retrieval-Generation Synergy]({% link _summaries/2024-10-03-enhancing-retrieval-augmented-large-language-models-with-iterative-retrieval-generation-synergy.markdown %})
-paper, except that instead of having a pre-defined number of iterations, the CoT-guided LLM decides when to stop.
+The paper's key insight is to use CoT to guide retrieval, and then to use the
+retrieved contents to guide the next step of CoT.
+
+This is done as follows:
+
+- Generate one sentence of CoT
+- Use that CoT sentence as the query to retrieve an additional piece of context
+- With the new context added, repeat the previous steps until an answer is produced or the maximum number of steps is reached
+
+The retrieved context is shuffled into a random order at each step. As the LLM
+may output multiple sentences of CoT in one call, only the first newly
+generated sentence is kept and the rest is discarded.
+
+Here's the overall structure of the prompt:
+
+```
+Wikipedia Title: <Page Title>
+<Paragraph Text>
+...
+Wikipedia Title: <Page Title>
+<Paragraph Text>
+Q: <Question>
+A: <CoT-Sent-1> ... <CoT-Sent-n>
+```
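+
+Putting the pieces together, here's a minimal sketch of the interleaved loop in
+Python. The `retriever.search()` and `llm.generate()` interfaces, the
+`build_prompt()` helper, the per-step `top_k`, and the `"answer is"` stop
+marker are illustrative assumptions, not the authors' actual implementation.
+
+```python
+import random
+
+
+def build_prompt(paragraphs, question, cot_sentences):
+    # Mirrors the template above: retrieved paragraphs, then Q, then the CoT so far.
+    blocks = [f"Wikipedia Title: {title}\n{text}" for title, text in paragraphs]
+    return "\n".join(blocks) + f"\nQ: {question}\nA: " + " ".join(cot_sentences)
+
+
+def ircot(question, retriever, llm, max_steps=8, top_k=4):
+    # Initial retrieval uses the question itself as the query; the retriever is
+    # assumed to return (title, paragraph) pairs.
+    paragraphs = retriever.search(question, top_k=top_k)
+    cot_sentences = []
+    for _ in range(max_steps):
+        # Retrieved paragraphs are shuffled into a random order at each step.
+        random.shuffle(paragraphs)
+        output = llm.generate(build_prompt(paragraphs, question, cot_sentences))
+        # The model may emit several CoT sentences; keep only the first new one.
+        first = output.strip().split(". ")[0].rstrip(".") + "."
+        cot_sentences.append(first)
+        # Stop once the model produces an answer sentence (assumed marker).
+        if "answer is" in first.lower():
+            break
+        # Otherwise the newest CoT sentence drives the next retrieval.
+        paragraphs += retriever.search(first, top_k=top_k)
+    return cot_sentences
+```
+
+Calling `ircot(question, retriever, llm)` would return the full chain of
+thought, with the final sentence containing the answer whenever the loop
+terminates before hitting `max_steps`.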
 
 {% include figure.html
     path="/assets/img/summaries/ir_cot_workflow.webp"
@@ -40,4 +61,4 @@ Marginal novelty given other techniques like self-ask that came previously.
 ### Conclusions for Future Work
 
 Could use a LLM to drive querying of RAG datastores guided by CoT to resolve
-complex queries.
\ No newline at end of file
+complex queries.