
Add dedicated prompt configuration for ML search resp processors #389

Merged · 4 commits · Sep 18, 2024

Conversation


@ohltyler (Member) commented Sep 18, 2024

Description

Adds a dedicated section and modal for optionally configuring a prompt. It is exposed only in the search response context, since that is where users are most likely to set one up; it can easily be enabled for other contexts later if wanted. The dedicated modal has three main parts: 1/ a dropdown of prompt template presets for users to select from, 2/ a user-editable text editor that displays the prompt, and 3/ a list of model inputs (and their corresponding placeholder strings) that users can copy/paste to insert these dynamic values into the template at runtime (this part is omitted if no input interface is found). A common example is passing the returned documents as context in the prompt template so that some LLM can answer a question about them, summarize them, etc.
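To illustrate the runtime behavior the modal is configuring, a minimal sketch of placeholder injection is shown below. All names here (`PromptPreset`, `injectInputs`, the `${parameters.*}` placeholder shape) are hypothetical stand-ins, not the plugin's actual types or the exact placeholder syntax:

```typescript
// Hypothetical types for illustration only; the plugin's real types differ.
interface PromptPreset {
  name: string;
  prompt: string;
}

// Assumed placeholder shape: a model input "context" appears in the
// template as the literal string "${parameters.context}".
function injectInputs(
  template: string,
  inputs: Record<string, string>
): string {
  let result = template;
  for (const [name, value] of Object.entries(inputs)) {
    // Replace every occurrence of this input's placeholder with its value.
    result = result.split('${parameters.' + name + '}').join(value);
  }
  return result;
}

const preset: PromptPreset = {
  name: 'Summarize',
  prompt: 'Summarize the following documents:\n${parameters.context}',
};

const filled = injectInputs(preset.prompt, {
  context: 'Doc 1: ...\nDoc 2: ...',
});
// filled === 'Summarize the following documents:\nDoc 1: ...\nDoc 2: ...'
```

In the actual feature, this substitution happens inside the ML processor at search time; the modal only helps users author a template containing the right placeholder strings.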

Implementation details:

  • Adds prompt presets in constants (just two very basic ones for now).
  • Updates MLProcessorInputs to include a new section for configuring the prompt, rendering the modal when clicked.
  • Adds a new ConfigurePromptModal component to handle the user interactions and configuration described above. Any updates to the prompt are reflected in the form by updating the prompt sub-field under the model config form field.
  • Minor: refactors the new modal and the existing input/output modals into a standalone modals module.
  • Minor: adds a touched check when selecting a new query, so the save button is enabled immediately on change (if nothing else is dirty).
  • Minor: adds index config injection in the quick-configure modal for the RAG use case, so any configured text field is set as a text field in the index mapping by default.
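The first and third bullets can be sketched roughly as follows. The constant name, preset contents, and form-update helper are all assumptions for illustration, not the plugin's actual code; the only thing taken from the description is that presets live in constants and that saving the modal writes a prompt sub-field into the model config:

```typescript
// Hypothetical preset constants (the PR adds "just two very basic ones").
const PROMPT_PRESETS = [
  {
    name: 'Summarize documents',
    prompt: 'Summarize the following:\n${parameters.context}',
  },
  {
    name: 'Answer a question',
    prompt:
      'Answer the question using the context below.\n' +
      'Question: ${parameters.question}\nContext: ${parameters.context}',
  },
];

// Hypothetical helper: on save, write the prompt into the model config
// JSON held by the form's model config field.
function updateModelConfigPrompt(modelConfig: string, prompt: string): string {
  const parsed = JSON.parse(modelConfig || '{}');
  parsed.prompt = prompt;
  return JSON.stringify(parsed, null, 2);
}
```

Writing the prompt back into the model config field (rather than storing it separately) is what lets the rest of the form, and its dirty/touched tracking, pick up the change automatically.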

Demo video showing a new RAG use case with no prompt template. It shows configuration of the template using a preset, including the dynamic context input to be injected into the template. The full example with injected inputs can be viewed when configuring the input transform. When run, the results are as expected, containing a readable summary of the results found.

screen-capture.1.webm

Issues Resolved

Resolves #380

Check List

  • Commits are signed per the DCO using --signoff

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following the Developer Certificate of Origin and signing off your commits, please check here.

@saimedhi (Collaborator) left a comment

Thank you, Tyler. Looks good to me.

@ohltyler ohltyler merged commit c1f517e into opensearch-project:main Sep 18, 2024
13 checks passed
@ohltyler ohltyler deleted the prompt-building branch September 18, 2024 20:25
opensearch-trigger-bot bot pushed a commit that referenced this pull request Sep 18, 2024
ohltyler added a commit that referenced this pull request Sep 18, 2024
… (#390)

Signed-off-by: Tyler Ohlsen <[email protected]>
(cherry picked from commit c1f517e)

Co-authored-by: Tyler Ohlsen <[email protected]>
Successfully merging this pull request may close these issues.

Increase support for prompt template building and transformation