Fixes #149: Add Temperature and Max Tokens Configuration #176
base: staging
Conversation
@ahmad2b is attempting to deploy a commit to the LangChain Team on Vercel. A member of the Team first needs to authorize it.
Thanks for this, everything looks good to start! There does seem to be an issue around the config options not being specific to the provider. E.g. if I change the max tokens of Claude, it changes the max tokens of the OpenAI model. These should be completely independent.
Also, in the future please open PRs against the staging branch, and not main.
Let me know when this has been resolved! Thank you!
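One way to keep the settings independent is to key the config state by model/provider rather than sharing a single object. The sketch below is illustrative only, not the PR's actual implementation; the type names, model keys, and default values are assumptions.

```typescript
// Hypothetical per-model config shape; names and defaults are illustrative.
interface ModelConfig {
  temperature: number;
  maxTokens: number;
}

type ProviderConfigs = Record<string, ModelConfig>;

const DEFAULT_CONFIG: ModelConfig = { temperature: 0.5, maxTokens: 4096 };

// Each provider/model gets its own entry, so updating one never touches another.
let configs: ProviderConfigs = {
  "anthropic/claude-3-5-sonnet": { ...DEFAULT_CONFIG },
  "openai/gpt-4o": { ...DEFAULT_CONFIG },
};

function updateConfig(model: string, patch: Partial<ModelConfig>): void {
  configs = {
    ...configs,
    [model]: { ...(configs[model] ?? DEFAULT_CONFIG), ...patch },
  };
}

// Changing Claude's max tokens leaves the OpenAI entry untouched.
updateConfig("anthropic/claude-3-5-sonnet", { maxTokens: 8192 });
```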
Thanks for the thorough review! I will separate the config state between providers to make the settings fully independent. Noted on using the staging branch for future PRs.
@bracesproul I've implemented the requested changes to make model configurations provider-specific:
The config modifications are now isolated per provider, with the ability to reset individual model settings back to their defaults as shown in the video below. Changes have been pushed and are ready for another review. Let me know if you need any adjustments! feedback.ed.mp4
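A per-model reset helper could look like the short sketch below; it is a rough illustration that reuses the assumed `configs`/`DEFAULT_CONFIG` shape from the earlier sketch, not the code in this PR.

```typescript
// Reset one model's settings to defaults without affecting other providers (illustrative).
function resetConfig(model: string): void {
  configs = { ...configs, [model]: { ...DEFAULT_CONFIG } };
}

resetConfig("anthropic/claude-3-5-sonnet"); // back to defaults; "openai/gpt-4o" unchanged
```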
Hey @ahmad2b could you fix these merge conflicts? Once done I can review. Thanks!
Adds temperature and max tokens controls to artifact generation nodes.
CHANGES
IMPLEMENTED IN NODES
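As a rough illustration of how a generation node could pick up these values, the sketch below passes them into the LangChain chat model constructors. The config lookup, the `getModelForNode` helper, and the model keys are assumptions for the example, not the PR's actual code.

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { ChatAnthropic } from "@langchain/anthropic";

// Hypothetical helper: build the chat model for a node from the per-provider config.
function getModelForNode(modelKey: string) {
  const { temperature, maxTokens } = configs[modelKey] ?? DEFAULT_CONFIG;
  if (modelKey.startsWith("anthropic/")) {
    return new ChatAnthropic({
      model: modelKey.replace("anthropic/", ""),
      temperature,
      maxTokens,
    });
  }
  return new ChatOpenAI({
    model: modelKey.replace("openai/", ""),
    temperature,
    maxTokens,
  });
}

// Example: an artifact generation node invokes the configured model.
const model = getModelForNode("openai/gpt-4o");
// const response = await model.invoke(messages);
```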
Please let me know if any changes or additional information is needed.
Fixes #149