
Fixes #149: Add Temperature and Max Tokens Configuration #176

Open · wants to merge 6 commits into staging
Conversation

@ahmad2b (Contributor) commented Nov 4, 2024

Adds temperature and max tokens controls to artifact generation nodes.

CHANGES

  • Dynamic max tokens slider, with its range based on each model's limits
  • Temperature slider (0-1 or 0-2, depending on the range the model supports)
  • Default values for each model set in constants.ts (see the sketch below)
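
For illustration, here's a minimal sketch of what the per-model defaults in constants.ts could look like. The names, shape, and limit values are illustrative only, not the exact code in this PR:

```ts
// Illustrative shape only; field names and limit values are examples.
export interface ModelConfig {
  temperatureRange: { min: number; max: number; default: number };
  maxTokens: { min: number; max: number; default: number };
}

export const MODEL_CONFIGS: Record<string, ModelConfig> = {
  "gpt-4o": {
    // OpenAI models accept temperatures from 0 to 2.
    temperatureRange: { min: 0, max: 2, default: 0.5 },
    maxTokens: { min: 1, max: 16384, default: 4096 },
  },
  "claude-3-5-sonnet-20240620": {
    // Anthropic models accept temperatures from 0 to 1.
    temperatureRange: { min: 0, max: 1, default: 0.5 },
    maxTokens: { min: 1, max: 8192, default: 4096 },
  },
};
```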

IMPLEMENTED IN NODES

  • generateArtifact
  • updateArtifact
  • updateHighlightedText
  • rewriteArtifact
  • rewriteArtifactTheme
  • rewriteCodeArtifactTheme
  • customAction
  • replyToGeneralInput
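
Each of these nodes reads the same per-model settings when it constructs its chat model. As a simplified sketch (the helper and the config plumbing below are hypothetical, not the PR's actual code), a node could resolve the values roughly like this:

```ts
import { ChatOpenAI } from "@langchain/openai";

// Hypothetical helper: use the user's saved values if present,
// otherwise fall back to the model's defaults from the constants.ts sketch above.
function getConfiguredModel(
  modelName: string,
  overrides?: { temperature?: number; maxTokens?: number }
) {
  const defaults = MODEL_CONFIGS[modelName];
  return new ChatOpenAI({
    model: modelName,
    temperature: overrides?.temperature ?? defaults.temperatureRange.default,
    maxTokens: overrides?.maxTokens ?? defaults.maxTokens.default,
  });
}

// e.g. inside generateArtifact (field names hypothetical):
// const model = getConfiguredModel("gpt-4o", config.modelSettings?.["gpt-4o"]);
```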

Please let me know if any changes or additional information is needed.

Fixes #149



vercel bot commented Nov 4, 2024

@ahmad2b is attempting to deploy a commit to the LangChain Team on Vercel.

A member of the Team first needs to authorize it.

@bracesproul changed the base branch from main to staging on November 5, 2024 at 20:19
@bracesproul (Collaborator) left a comment

Thanks for this, everything looks good to start! There does seem to be an issue around config options not being specific to the provider, e.g. if I change the max tokens of the Claude model, it also changes the max tokens of the OpenAI model. These should be completely independent.

Also, in the future please open PRs against the staging branch, and not main.

Let me know when this has been resolved! Thank you!

@ahmad2b (Contributor, Author) commented Nov 5, 2024

Thanks for the thorough review! I will separate the config state between providers to make settings fully independent.

Noted on using the staging branch - will follow that for all future PRs. I'll push the fix soon!

@ahmad2b (Contributor, Author) commented Nov 7, 2024

@bracesproul I've implemented the requested changes to make model configurations provider-specific:

  1. Each model now maintains its own independent config state using a Record-based approach
  2. Each model's config is initialized with its defaults from constants.ts
  3. Fixed the cross-provider interaction, so changing one model's settings no longer affects another model
  4. Added reset functionality for individual models

The config modifications are now isolated per provider, with the ability to reset individual model settings back to their defaults as shown in the video below.
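
Roughly, the Record-based approach looks like this (a simplified sketch; the names are illustrative and the real state lives in the app's React state, building on the constants.ts sketch above):

```ts
type ModelSettings = { temperature: number; maxTokens: number };

// Build the initial state: one independent entry per model,
// seeded from the defaults in the constants.ts sketch above.
function initialSettings(): Record<string, ModelSettings> {
  const state: Record<string, ModelSettings> = {};
  for (const [name, cfg] of Object.entries(MODEL_CONFIGS)) {
    state[name] = {
      temperature: cfg.temperatureRange.default,
      maxTokens: cfg.maxTokens.default,
    };
  }
  return state;
}

// Updating one model only touches that model's entry.
function updateModelSettings(
  state: Record<string, ModelSettings>,
  model: string,
  patch: Partial<ModelSettings>
): Record<string, ModelSettings> {
  return { ...state, [model]: { ...state[model], ...patch } };
}

// Reset a single model back to its defaults without affecting the others.
function resetModelSettings(
  state: Record<string, ModelSettings>,
  model: string
): Record<string, ModelSettings> {
  return { ...state, [model]: initialSettings()[model] };
}
```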

Changes have been pushed and are ready for another review. Let me know if you need any adjustments!

feedback.ed.mp4

@bracesproul (Collaborator) commented

Hey @ahmad2b, could you fix these merge conflicts? Once done, I can review. Thanks!

Merging this pull request may close: Add settings config for temperature, max tokens