
Using OpenAI Batch endpoint #753

Open
jimmy6DOF opened this issue Aug 14, 2024 · 1 comment

Comments


jimmy6DOF commented Aug 14, 2024

Hello, I have some jobs that have zero time constraints but are cost sensitive, so I'm wondering how to integrate the batch endpoint(s) from OpenAI. Since there is already a lot of async waiting for model output, maybe this could make sense for some, if not all, requests?

Batch API -- Higher rate limits & 50% discounted Tokens

Edit: it would be more work, but still in the cost-optimization bracket: could the new prompt caching endpoints also be integrated?
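For reference, a minimal sketch of what a Batch API integration could look like. The helper names, file path, and model choice here are illustrative assumptions, not anything already in this project; the JSONL line shape and the `/v1/chat/completions` endpoint follow the OpenAI Batch API docs:

```python
import json

def build_batch_request(custom_id, user_prompt, model="gpt-4o-mini"):
    """Build one JSONL line for an OpenAI Batch API input file.

    Each line is an independent request to /v1/chat/completions;
    batched jobs complete within 24 hours at discounted token pricing.
    """
    return {
        "custom_id": custom_id,          # caller-chosen ID to match results back
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": user_prompt}],
        },
    }

def write_batch_file(path, prompts):
    # The Batch API expects one JSON object per line (JSONL).
    with open(path, "w") as f:
        for i, prompt in enumerate(prompts):
            line = build_batch_request(f"req-{i}", prompt)
            f.write(json.dumps(line) + "\n")

# Submitting the file requires an API key, so it is only sketched here:
#
#   from openai import OpenAI
#   client = OpenAI()
#   uploaded = client.files.create(file=open("batch.jsonl", "rb"),
#                                  purpose="batch")
#   job = client.batches.create(input_file_id=uploaded.id,
#                               endpoint="/v1/chat/completions",
#                               completion_window="24h")
#   # ...then poll client.batches.retrieve(job.id) until it completes
#   # and download the output file for the per-request results.
```

Since results arrive asynchronously, the caller would need to poll (or schedule a later check) and join completed responses back to pending work via `custom_id`.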

Collaborator

ElishaKay commented Aug 16, 2024

@jimmy6DOF interesting idea for a PR.

We're thinking about how to make this process smoother.

Happy to hear more about relevant use cases.
