Changing Prompt of Accumulate Response Synthesizer. #14483
Comments
To modify the `Accumulate` class so that it skips documents that provide no context, you can subclass it and override `synthesize`. Here is an example of how you might modify the `Accumulate` class:

```python
from llama_index.legacy.response_synthesizers.base import BaseSynthesizer
# Note: this import path may vary across llama_index versions.
from llama_index.legacy.prompts.default_prompt_selectors import DEFAULT_TEXT_QA_PROMPT_SEL


class Accumulate(BaseSynthesizer):
    def __init__(self, text_qa_template=None, service_context=None, output_cls=None, streaming=False, use_async=False):
        super().__init__(service_context=service_context, streaming=streaming)
        self._text_qa_template = text_qa_template or DEFAULT_TEXT_QA_PROMPT_SEL
        self._use_async = use_async
        self._output_cls = output_cls

    def synthesize(self, documents):
        responses = []
        for doc in documents:
            if not doc.context:  # skip the document if no context is provided
                continue
            response = self._service_context.llm.predict(self._text_qa_template, context_str=doc.context)
            responses.append(response)
        return self._accumulate_responses(responses)

    def _accumulate_responses(self, responses):
        # Accumulate by simple concatenation; adapt as needed.
        return " ".join(responses)
```

In this example, the `synthesize` method checks each document's context and skips any document that provides none, so those documents never contribute to the final output.
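The skip-and-accumulate logic above can be sketched without LlamaIndex at all. Below is a minimal, self-contained Python sketch (the `Doc` class and `fake_llm_predict` function are hypothetical stand-ins, not LlamaIndex APIs) showing how chunks without context are dropped before accumulation:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Doc:
    # Hypothetical stand-in for a retrieved chunk; None or "" means no usable context.
    context: Optional[str]


def fake_llm_predict(context: str) -> str:
    # Stand-in for service_context.llm.predict(...); just echoes the context.
    return f"Answer based on: {context}"


def accumulate(documents: List[Doc]) -> str:
    responses = []
    for doc in documents:
        if not doc.context:  # skip chunks that carry no context
            continue
        responses.append(fake_llm_predict(doc.context))
    return " ".join(responses)


print(accumulate([Doc("foo facts"), Doc(None), Doc("bar facts")]))
# prints "Answer based on: foo facts Answer based on: bar facts"
```

The document with `context=None` contributes nothing to the joined output, which is exactly the behaviour the override above implements.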
Can we change the prompt instead of doing this?
Yes, you can change the prompt used by the accumulate response synthesizer so that it skips generating a response for documents that don't provide context for the query, without modifying the `Accumulate` class itself. Here is an example of how you might modify the prompt template:

```python
from llama_index.core.prompts.base import PromptTemplate
from llama_index.core.response_synthesizers import get_response_synthesizer, ResponseMode
from llama_index.core.service_context import ServiceContext


class CustomPromptTemplate(PromptTemplate):
    def partial_format(self, query_str: str) -> str:
        # Custom logic to check if the document provides context for the query
        return f"""
{{% if context_str contains '{query_str}' %}}
Answer the following question based on the context: {query_str}
{{% else %}}
No relevant context found.
{{% endif %}}
"""


# Use the custom prompt template in the Accumulate synthesizer
custom_prompt_template = CustomPromptTemplate("{context_str}{query_str}", prompt_type="QUESTION_ANSWER")
service_context = ServiceContext.from_defaults()
accumulate_synthesizer = get_response_synthesizer(
    service_context=service_context,
    text_qa_template=custom_prompt_template,
    response_mode=ResponseMode.ACCUMULATE,
)

# Example usage
query_str = "What is?"
texts = [
    "This is bar",
    "This is foo",
]
response = accumulate_synthesizer.get_response(text_chunks=texts, query_str=query_str)
print(response)
```

This custom prompt template uses a conditional statement to check whether the `context_str` contains the query, and only asks for an answer when it does; otherwise it produces a "No relevant context found." sentinel that you can filter out of the accumulated output.
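Whether a template-level conditional like the one above is actually evaluated depends on how the template is rendered, but the underlying check is simple. Here is a minimal pure-Python sketch of the same idea (the `format_prompt` function is illustrative, not a LlamaIndex API):

```python
def format_prompt(context_str: str, query_str: str) -> str:
    # Build a QA prompt only when the chunk's context mentions the query;
    # otherwise emit a sentinel that downstream code can filter out.
    if query_str.lower() in context_str.lower():
        return (
            f"Answer the following question based on the context: {query_str}\n"
            f"Context: {context_str}"
        )
    return "No relevant context found."


print(format_prompt("This is bar", "bar"))   # builds a QA prompt
print(format_prompt("This is foo", "bar"))   # prints "No relevant context found."
```

A naive substring match like this is only a sketch; in practice you would likely rely on the retriever's relevance scores rather than literal string containment.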
How can I call the `update_prompts` method and pass the prompt in `Accumulate`?
To call the `update_prompts` method on the `Accumulate` response synthesizer, pass a dictionary mapping the prompt name (`"text_qa_template"`) to your custom `PromptTemplate`.

Here is an example:

```python
from llama_index.core.prompts.base import PromptTemplate
from llama_index.core.response_synthesizers.accumulate import Accumulate

# Define your custom prompt template
custom_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query in a detailed manner.\n"
    "Query: {query_str}\n"
    "Answer: "
)
custom_prompt = PromptTemplate(custom_prompt_str)

# Update the prompt in the Accumulate response synthesizer
accumulate_synthesizer = Accumulate()
accumulate_synthesizer.update_prompts(
    {"text_qa_template": custom_prompt}
)
```

This code snippet defines a custom prompt template and updates the `text_qa_template` used by the `Accumulate` response synthesizer.
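The `update_prompts` call works because LlamaIndex components expose their prompts as a dictionary keyed by name. That pattern can be sketched in plain Python (the `PromptMixinSketch` class below is an illustrative stand-in, not the real LlamaIndex implementation):

```python
class PromptMixinSketch:
    """Illustrative stand-in for the prompt-mixin pattern: prompts live in a
    dict keyed by name and can be swapped at runtime via update_prompts."""

    def __init__(self):
        self._prompts = {"text_qa_template": "default QA prompt"}

    def get_prompts(self) -> dict:
        # Return a copy so callers cannot mutate internal state directly.
        return dict(self._prompts)

    def update_prompts(self, prompts_dict: dict) -> None:
        for key, template in prompts_dict.items():
            if key in self._prompts:  # only known prompt names are replaced
                self._prompts[key] = template


synth = PromptMixinSketch()
synth.update_prompts({"text_qa_template": "custom QA prompt"})
print(synth.get_prompts()["text_qa_template"])  # prints "custom QA prompt"
```

Keying prompts by name is what lets `{"text_qa_template": custom_prompt}` target exactly the prompt the synthesizer uses for question answering.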
Question Validation
Question
I was using the accumulate response synthesizer and wanted to know if I could change its prompt. The accumulate mode generates one response per retrieved chunk of data. If a document doesn't provide context for the query, the output includes something like "No context is provided in the document [Document Name or something]". Instead of writing those lines into the final output, can we change the behaviour to simply skip documents that provide no context and omit their responses?