Hi,
In the code below from the script src/api/query_openai_over_tasks.py:
if task['input_prompt_code'] in cache:
    logging.info(
        f"Task {task_idx} > Using cached result for {task['input_prompt_code']}")
    codex_response = cache[task['input_prompt_code']]["codex_response"]
else:
    codex_response = query_codex(task, prompt_text, engine, max_tokens=max_tokens)

completed_code = get_completed_code(task, codex_response)
graph = converter.python_to_graph(completed_code)
it seems the converter is applied only after the response is generated by OpenAI. Can you please verify whether this is correct? It looks like the prompts are not actually changed when different prompting schemes are applied before calling OpenAI.
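For clarity, here is a minimal sketch (not the repository's actual code) of what I expected to happen: the prompt itself would be serialized through the scheme-specific converter before query_codex is called, so that choosing a different prompting scheme changes prompt_text. The graph_to_python method, few_shot_examples, and task['graph'] are my assumptions for illustration; only python_to_graph appears in the snippet above.

    def build_prompt(task, converter, few_shot_examples):
        # Serialize each few-shot example with the scheme-specific converter,
        # then append the query graph, so the prompt reflects the chosen scheme.
        # graph_to_python / task['graph'] are hypothetical names for illustration.
        parts = [converter.graph_to_python(ex) for ex in few_shot_examples]
        parts.append(converter.graph_to_python(task['graph']))
        return "\n\n".join(parts)

    prompt_text = build_prompt(task, converter, few_shot_examples)
    codex_response = query_codex(task, prompt_text, engine, max_tokens=max_tokens)

If prompt_text is instead built once, independently of the converter, that would explain why different schemes produce identical prompts.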