
chore: Adding responseFormat parameter in OpenAI Chat Completion Request #2322

Open · wants to merge 2 commits into master
Conversation

FarrukhMasud (Contributor)

OpenAI chat completion accepts `response_format` as a request parameter. Setting this parameter to `json_object` guarantees that the response from the OpenAI service will be in JSON format. It also requires that at least one of the prompts contain the word "JSON".

What changes are proposed in this pull request?

In this PR, we add a new request field, which can be `json_object` or `text` as prescribed by OpenAI. When it is set to `json_object`, we also add a system prompt instructing OpenAI to respond in JSON format.

Unit tests have been added to validate this functionality.
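To illustrate the behavior described above, here is a minimal sketch in Python (not the PR's actual Scala implementation; the helper name and model are hypothetical): when `response_format` is `json_object`, a system message mentioning JSON is prepended if none of the messages already contains the word, since OpenAI rejects `json_object` requests otherwise.

```python
def build_chat_request(messages, response_format="text"):
    """Build an OpenAI Chat Completions request body as a plain dict.

    Hypothetical sketch of the behavior this PR describes: ensure some
    message contains the word "JSON" when json_object mode is requested.
    """
    msgs = list(messages)
    if response_format == "json_object":
        # OpenAI requires that a prompt mention "JSON" when using
        # response_format json_object; add a system prompt if missing.
        if not any("JSON" in m.get("content", "") for m in msgs):
            msgs.insert(0, {"role": "system",
                            "content": "Respond in JSON format."})
    return {
        "model": "gpt-4o-mini",  # placeholder model name
        "messages": msgs,
        "response_format": {"type": response_format},
    }

req = build_chat_request(
    [{"role": "user", "content": "List three colors."}],
    response_format="json_object",
)
print(req["messages"][0]["content"])  # the prepended system prompt
```

The unit tests in the PR validate this same invariant: the request carries the chosen `response_format`, and a JSON-mentioning system prompt is injected only when needed.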

How is this patch tested?

  • I have written tests (not required for typo or doc fix) and confirmed the proposed feature/bug-fix/change works.

Does this PR change any dependencies?

  • No. You can skip this section.
  • Yes. Make sure the dependencies are resolved correctly, and list changes here.

Does this PR add a new feature? If so, have you added samples on website?

  • No. You can skip this section.
  • Yes. Make sure you have added samples following below steps.

@FarrukhMasud (Contributor, Author)

/azp run


Azure Pipelines could not run because the pipeline triggers exclude this branch/path.

@mhamilton723 (Collaborator)

/azp run


Azure Pipelines successfully started running 1 pipeline(s).

@codecov-commenter commented Dec 2, 2024

Codecov Report

Attention: Patch coverage is 95.00000% with 2 lines in your changes missing coverage. Please review.

Project coverage is 83.25%. Comparing base (4a6a041) to head (efb9b4d).

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| ...zure/synapse/ml/services/openai/OpenAIPrompt.scala | 50.00% | 2 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #2322      +/-   ##
==========================================
+ Coverage   83.18%   83.25%   +0.06%     
==========================================
  Files         328      328              
  Lines       16786    16824      +38     
  Branches     1501     1520      +19     
==========================================
+ Hits        13964    14006      +42     
+ Misses       2822     2818       -4     


@mhamilton723 (Collaborator)

/azp run


Azure Pipelines successfully started running 1 pipeline(s).
