fanpu committed Oct 23, 2024
bib_id: 2212.10509v2

#### 1. Interleaved Retrieval guided by Chain-of-Thought

The overall idea is quite similar to the
[Enhancing Retrieval-Augmented Large Language Models with Iterative Retrieval-Generation Synergy]({% link _summaries/2024-10-03-enhancing-retrieval-augmented-large-language-models-with-iterative-retrieval-generation-synergy.markdown %})
paper, except that instead of running a pre-defined number of iterations, the CoT-guided LLM decides when to stop.
The paper's insight is to use CoT to guide retrieval, and then use the retrieved
contents to guide further CoT.

This is done as follows:

- Generate one sentence of CoT
- Use the CoT sentence as a query to retrieve an additional piece of context
- With the new context, repeat the previous steps until an answer is produced or the maximum number of steps is reached

The retrieved context is ordered randomly at each step. As the LLM may output
multiple CoT sentences at once, only the first newly generated sentence is kept
and the rest are discarded.
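The loop above can be sketched as follows. This is a minimal sketch, not the paper's implementation: `retrieve`, `generate_cot`, and the answer-detection check are hypothetical stand-ins for the retriever, the prompted LLM, and the paper's stopping condition.

```python
import random

def ircot(question, retrieve, generate_cot, max_steps=8, k=4):
    """Sketch of interleaved retrieval guided by CoT (hypothetical helpers).

    retrieve(query, k)                      -> list of paragraphs for the query
    generate_cot(paragraphs, question, cot) -> list of new CoT sentences
    """
    paragraphs = retrieve(question, k)    # initial retrieval using the question itself
    cot = []                              # accumulated chain-of-thought sentences
    for _ in range(max_steps):
        random.shuffle(paragraphs)        # retrieved context is ordered randomly
        sentences = generate_cot(paragraphs, question, cot)
        first = sentences[0]              # keep only one new sentence, drop the rest
        cot.append(first)
        if "answer is" in first.lower():  # stop once the model states an answer
            break
        paragraphs += retrieve(first, k)  # retrieve again with the new CoT sentence
    return " ".join(cot)
```

Each new CoT sentence doubles as the next retrieval query, which is what lets retrieval and reasoning feed each other.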

Here's the overall structure of the prompt:

```
Wikipedia Title: <Page Title>
<Paragraph Text>
...
Wikipedia Title: <Page Title>
<Paragraph Text>
Q: <Question>
A: <CoT-Sent-1> ... <CoT-Sent-n>
```
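A small helper showing how such a prompt string might be assembled; the function name and argument shapes are my own illustration, not from the paper.

```python
def build_prompt(paragraphs, question, cot_sentences):
    """Assemble the in-context prompt in the format shown above.

    paragraphs: list of (title, text) pairs from retrieval
    cot_sentences: CoT sentences generated so far, appended after "A:"
    """
    blocks = [f"Wikipedia Title: {title}\n{text}" for title, text in paragraphs]
    blocks.append(f"Q: {question}")
    blocks.append("A: " + " ".join(cot_sentences))
    return "\n".join(blocks)
```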

{% include figure.html
path="/assets/img/summaries/ir_cot_workflow.webp"
%}

Marginal novelty given prior techniques like self-ask.
### Conclusions for Future Work

Could use an LLM to drive querying of RAG datastores, guided by CoT, to resolve
complex queries.
