fix: assistant function call when multiple tools are needed in the same… #4483

Merged
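
At a glance: when an Assistants run enters requires_action, it can ask for several tool calls at once, but the previous code built an output only for tool_calls[0], so runs needing more than one tool were not answered completely. Below is a minimal sketch of the pattern this PR adopts, using the SDK calls visible in the diff; handleToolCall is a hypothetical helper standing in for the knowledge-base lookup, not code from this PR.

import OpenAI from "openai"

// Sketch only: answer every requested tool call, not just the first one.
// `handleToolCall` is a hypothetical stand-in for the lookup done in processAction.
const submitAllToolOutputs = async (
  openai: OpenAI,
  run: OpenAI.Beta.Threads.Runs.Run,
  threadId: string,
  handleToolCall: (args: string) => Promise<string>,
): Promise<void> => {
  const action = run.required_action
  if (action?.type !== "submit_tool_outputs") return

  const tool_outputs: { tool_call_id: string; output: string }[] = []
  for (const toolCall of action.submit_tool_outputs.tool_calls) {
    tool_outputs.push({
      tool_call_id: toolCall.id,
      output: await handleToolCall(toolCall.function.arguments),
    })
  }

  await openai.beta.threads.runs.submitToolOutputs(threadId, run.id, { tool_outputs })
}
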
2 changes: 2 additions & 0 deletions core/api/src/domain/support/errors.ts
@@ -13,6 +13,8 @@ export class UnknownPineconeError extends SupportError {
export class ChatAssistantError extends SupportError {}
export class ChatAssistantNotFoundError extends SupportError {}

export class TimeoutAssistantError extends SupportError {}

export class UnknownChatAssistantError extends ChatAssistantError {
level = ErrorLevel.Critical
}
1 change: 1 addition & 0 deletions core/api/src/graphql/error-map.ts
@@ -823,6 +823,7 @@ export const mapError = (error: ApplicationError): CustomGraphQLError => {
case "UnknownPineconeError":
case "CallbackServiceError":
case "ChatAssistantNotFoundError":
case "TimeoutAssistantError":
message = `Unknown error occurred (code: ${error.name})`
return new UnknownClientError({ message, logger: baseLogger })

68 changes: 40 additions & 28 deletions core/api/src/services/openai/assistant.ts
@@ -11,6 +11,7 @@ import { sleep } from "@/utils"
import { UnknownDomainError } from "@/domain/shared"
import {
ChatAssistantNotFoundError,
TimeoutAssistantError,
UnknownChatAssistantError,
} from "@/domain/support/errors"

@@ -134,30 +135,36 @@ export const Assistant = (): ChatAssistant => {
}
}

const processAction = async (run: OpenAI.Beta.Threads.Runs.Run) => {
const processAction = async (run: OpenAI.Beta.Threads.Runs.Run): Promise<string[]> => {
const action = run.required_action
assert(action?.type === "submit_tool_outputs")

const name = action.submit_tool_outputs.tool_calls[0].function.name
assert(name === "queryBlinkKnowledgeBase")
const outputs: string[] = []

const args = action.submit_tool_outputs.tool_calls[0].function.arguments
const query = JSON.parse(args).query_str
for (const toolCall of action.submit_tool_outputs.tool_calls) {
const name = toolCall.function.name
assert(name === "queryBlinkKnowledgeBase")
Collaborator: Not a blocker, but this is not a standard way to throw an exception. If we want to use it, please catch AssertionError properly in the catch block. (A sketch of this follows the processAction diff below.)

const vector = await textToVector(query)
if (vector instanceof Error) throw vector
const args = toolCall.function.arguments
const query = JSON.parse(args).query_str

const relatedQueries = await retrieveRelatedQueries(vector)
if (relatedQueries instanceof Error) throw relatedQueries
const vector = await textToVector(query)
if (vector instanceof Error) throw vector

let output = ""
let i = 0
for (const query of relatedQueries) {
output += `Context chunk ${i}:\n${query}\n-----\n`
i += 1
const relatedQueries = await retrieveRelatedQueries(vector)
if (relatedQueries instanceof Error) throw relatedQueries

let output = ""
let i = 0
for (const query of relatedQueries) {
output += `Context chunk ${i}:\n${query}\n-----\n`
i += 1
}

outputs.push(output)
}

return output
return outputs
}
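
Regarding the reviewer's note above about AssertionError: a hedged sketch of catching it explicitly inside waitForCompletion instead of letting it fall into the generic error path. Node's assert raises AssertionError on failure; the error message used here is an illustrative choice, not code from this PR.

import { AssertionError } from "node:assert"

// Drop-in for the try/catch around processAction in waitForCompletion:
// treat assertion failures (e.g. an unexpected tool name) explicitly.
let outputs: string[]
try {
  outputs = await processAction(run)
} catch (err) {
  if (err instanceof AssertionError) {
    return new UnknownChatAssistantError(`Unexpected tool call: ${err.message}`)
  }
  return new UnknownChatAssistantError(err)
}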

const waitForCompletion = async ({
@@ -166,8 +173,11 @@
}: {
runId: string
threadId: string
}) => {
}): Promise<true | ChatAssistantError> => {
let run: OpenAI.Beta.Threads.Runs.Run
const maxRetries = 60 // Assuming a 30-second timeout with 500ms sleep
let retries = 0

try {
run = await openai.beta.threads.runs.retrieve(threadId, runId)
} catch (err) {
@@ -177,32 +187,34 @@
while (
["queued", "in_progress", "cancelling", "requires_action"].includes(run.status)
) {
// TODO: max timer for this loop
// add open telemetry here? or is it already present with the http requests?
if (retries >= maxRetries) {
return new TimeoutAssistantError()
}

// Add telemetry here if needed
await sleep(500)
retries += 1

try {
run = await openai.beta.threads.runs.retrieve(threadId, runId)
} catch (err) {
return new UnknownChatAssistantError(err)
}

if (run.status === "requires_action") {
let output: string
let outputs: string[]
try {
output = await processAction(run)
outputs = await processAction(run)
} catch (err) {
return new UnknownChatAssistantError(err)
}

try {
await openai.beta.threads.runs.submitToolOutputs(threadId, runId, {
tool_outputs: [
{
tool_call_id: run.required_action?.submit_tool_outputs.tool_calls[0].id,
output,
},
],
tool_outputs: outputs.map((output, index) => ({
tool_call_id: run.required_action?.submit_tool_outputs.tool_calls[index].id,
output,
})),
})
} catch (err) {
return new UnknownChatAssistantError(err)
@@ -222,12 +234,12 @@
const responseThread = messages.data[0]

if (responseThread.content[0]?.type !== "text") {
return new UnknownChatAssistantError("last message is not text")
return new UnknownChatAssistantError("Last message is not text")
}

return true
} else {
return new UnknownChatAssistantError("issue running the assistant")
return new UnknownChatAssistantError("Issue running the assistant")
Collaborator: The else block is not necessary. (See the early-return sketch after this hunk.)

}
}
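
Regarding the reviewer's note above that the else block is not necessary, here is a sketch of the early-return form. Names are taken from this file; the message-listing call is folded out of the diff and assumed here.

if (run.status !== "completed") {
  return new UnknownChatAssistantError("Issue running the assistant")
}

// Assumed from the folded portion of the diff: fetch the latest thread messages.
const messages = await openai.beta.threads.messages.list(threadId)
const responseThread = messages.data[0]

if (responseThread.content[0]?.type !== "text") {
  return new UnknownChatAssistantError("Last message is not text")
}

return true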
