
Model parameters option to pass in model tuning, arbitrary parameters #430

Merged
merged 8 commits into main from add-endpoint-args on Nov 8, 2023

Conversation

@3coins (Collaborator) commented Nov 2, 2023

Fixes #361
Fixes #444

Description

This PR brings model_parameters as an additional option to pass in arbitrary values to the provider class during initialization. This option is available in both Chat UI and Magics.

This is useful for passing parameters such as model tuning that affect the response generation by the model.
This is also an appropriate place to pass in custom attributes required by certain providers/models.

The accepted value is a dictionary whose top-level keys are model IDs (in provider:model_id format); each value is an arbitrary dictionary that is unpacked and passed as-is to the provider class.
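The lookup-and-unpack behavior described above can be sketched as follows. This is an illustrative sketch, not Jupyter AI's actual internals; the helper `params_for` and the provider constructor call in the comment are hypothetical.

```python
# Sketch (not Jupyter AI internals): a model_parameters dictionary keyed by
# "provider_id:model_id", looked up and unpacked into provider kwargs.
model_parameters = {
    "bedrock:ai21.j2-mid-v1": {"model_kwargs": {"maxTokens": 200}},
    "anthropic:claude-2": {"max_tokens": 1024, "temperature": 0.9},
}

def params_for(global_model_id: str) -> dict:
    """Return the extra keyword arguments configured for a provider:model_id key."""
    return model_parameters.get(global_model_id, {})

# A provider would then be constructed roughly like:
#   AnthropicProvider(**params_for("anthropic:claude-2"))
print(params_for("anthropic:claude-2"))
```

Models without an entry simply get no extra keyword arguments.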

Configuring as a startup option

In this example, the bedrock provider will be instantiated with the given model_kwargs value when the ai21.j2-mid-v1 model is selected.

jupyter lab --AiExtension.model_parameters '{"bedrock:ai21.j2-mid-v1":{"model_kwargs":{"maxTokens":200}}}'

The above will result in the following LLM class being instantiated:

BedrockProvider(model_kwargs={"maxTokens":200}, ...)

Here is another example, where the anthropic provider will be instantiated with the given max_tokens and temperature values when the claude-2 model is selected.

jupyter lab --AiExtension.model_parameters '{"anthropic:claude-2":{"max_tokens":1024,"temperature":0.9}}'

The above will result in the following LLM class being instantiated:

AnthropicProvider(max_tokens=1024, temperature=0.9, ...)

Configuring via a config file

This configuration can also be specified in a config file in JSON format. The config can be loaded by passing the path to the config file:

jupyter lab --config <config.json>

Here is an example that configures the bedrock provider for the ai21.j2-mid-v1 model:

{
    "AiExtension": { 
        "model_parameters": {
            "bedrock:ai21.j2-mid-v1": {
                "model_kwargs": {
                    "maxTokens": 200
                }
            }
        }
    }
}
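Writing the file programmatically avoids hand-editing and shell-quoting mistakes. A minimal sketch, assuming the same "AiExtension"/"model_parameters" structure shown above; the file name "config.json" is just an example:

```python
import json

# Sketch: generate the JSON config file shown above programmatically,
# so the nesting and quoting are guaranteed to be valid JSON.
config = {
    "AiExtension": {
        "model_parameters": {
            "bedrock:ai21.j2-mid-v1": {"model_kwargs": {"maxTokens": 200}}
        }
    }
}

with open("config.json", "w") as f:
    json.dump(config, f, indent=4)
```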

Magic command samples

%%ai sagemaker-endpoint:jumpstart-dft-meta-textgeneration-llama-2-7b -q {"inputs":"<prompt>","parameters":{"max_new_tokens":64,"top_p":0.9,"temperature":0.6,"return_full_text":false}} -n us-east-1 -p [0].generation -m {"endpoint_kwargs":{"CustomAttributes":"accept_eula=true"}} -f text
Translate English to French:
sea otter => loutre de mer
peppermint => menthe poivrée
plush girafe => girafe peluche
cheese =>
%%ai bedrock:ai21.j2-mid-v1 -m {"model_kwargs":{"maxTokens":256}} -f text
Write me a short story on Python language
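The value passed to the -m option in the magics above is a JSON object. As a purely illustrative check (not part of Jupyter AI), it can be validated with json.loads before invoking the magic:

```python
import json

# Illustrative only: sanity-check the JSON string passed to %%ai -m
# before running the magic, to catch quoting or nesting mistakes early.
raw = '{"model_kwargs":{"maxTokens":256}}'
params = json.loads(raw)
print(params["model_kwargs"]["maxTokens"])
```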

@3coins 3coins added the enhancement New feature or request label Nov 2, 2023
@3coins 3coins changed the title Endpoint args for SM endpoints model_kwargs and endpoint_kwargs for Bedrock and SM Endpoint Nov 3, 2023
@3coins 3coins force-pushed the add-endpoint-args branch from 376b451 to 9149e1f Compare November 3, 2023 19:28
@3coins 3coins marked this pull request as ready for review November 3, 2023 19:28
@ellisonbg (Contributor)

Not sure this config should be in the UI. It might be a better match for traitlets-based config.

@3coins (Collaborator, Author) commented Nov 6, 2023

@ellisonbg
Although providing these options as traitlets would allow users to specify them at runtime, Jupyter AI currently doesn't have a mechanism to tie these attributes to a particular provider/model; we need a way to specify a complete settings config as runtime input. Also, in certain enterprise environments, a user might not have the ability to control the runtime parameters or the option to restart the server and supply these values. I realize that an input field might not be the ideal UX for these options, but providing them in the settings panel gives the user a path forward in controlling the model response.

@andrii-i (Collaborator) commented Nov 6, 2023

I was only able to test the UI so far (I don't have Bedrock access yet); it looks and works well: new fields are shown, and format verification on them works.


@3coins 3coins force-pushed the add-endpoint-args branch from 9149e1f to aab2f6f Compare November 8, 2023 02:58
@3coins 3coins changed the title model_kwargs and endpoint_kwargs for Bedrock and SM Endpoint Model parameters option to pass in model tuning, arbitrary parameters Nov 8, 2023
@3coins 3coins self-assigned this Nov 8, 2023
@3coins 3coins added the bug Something isn't working label Nov 8, 2023
@3coins 3coins force-pushed the add-endpoint-args branch from 0cad9d0 to 486b2fe Compare November 8, 2023 17:33
@3coins 3coins force-pushed the add-endpoint-args branch from 227bef0 to 35bd907 Compare November 8, 2023 17:35
docs/source/users/index.md (outdated review thread, resolved)
@dlqqq dlqqq mentioned this pull request Nov 8, 2023
* log configured model_parameters

* fix markdown formatting in docs

* fix single quotes and use preferred traitlets CLI syntax
@3coins 3coins merged commit 1fecfea into jupyterlab:main Nov 8, 2023
5 of 6 checks passed
@JasonWeill (Collaborator)

@meeseeksdev please backport to 1.x


lumberbot-app bot commented Nov 8, 2023

Owee, I'm MrMeeseeks, Look at me.

There seems to be a conflict; please backport manually. Here are approximate instructions:

  1. Check out the backport branch and update it:
git checkout 1.x
git pull
  2. Cherry-pick the first parent of this PR's merge commit on top of the older branch:
git cherry-pick -x -m1 1fecfea6556501212ac9b7309edb7f0c98c69618
  3. You will likely have some merge/cherry-pick conflicts here; fix them and commit:
git commit -am 'Backport PR #430: Model parameters option to pass in model tuning, arbitrary parameters'
  4. Push to a named branch:
git push YOURFORK 1.x:auto-backport-of-pr-430-on-1.x
  5. Create a PR against branch 1.x; I would have named this PR:

"Backport PR #430 on branch 1.x (Model parameters option to pass in model tuning, arbitrary parameters)"

And apply the correct labels and milestones.

Congratulations — you did some good work! Hopefully your backport PR will be tested by the continuous integration and merged soon!

Remember to remove the Still Needs Manual Backport label once the PR gets merged.

If these instructions are inaccurate, feel free to suggest an improvement.

JasonWeill pushed a commit to JasonWeill/jupyter-ai that referenced this pull request Nov 8, 2023
JasonWeill added a commit that referenced this pull request Nov 8, 2023
dbelgrod pushed a commit to dbelgrod/jupyter-ai that referenced this pull request Jun 10, 2024
…jupyterlab#430)

* Endpoint args for SM endpoints

* Added model and endpoints kwargs options.

* Added configurable option for model parameters.

* Updated magics, added model_parameters, removed model_kwargs and endpoint_kwargs.

* Fixes %ai error for SM endpoints.

* Fixed docs

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* 430 fixes (jupyterlab#2)

* log configured model_parameters

* fix markdown formatting in docs

* fix single quotes and use preferred traitlets CLI syntax

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: david qiu <[email protected]>
Marchlak pushed a commit to Marchlak/jupyter-ai that referenced this pull request Oct 28, 2024