vllm inference plugin #2967

Open · wants to merge 2 commits into master
Conversation

@dansola (Contributor) commented Nov 29, 2024

Why are the changes needed?

A vLLM addition to the existing flytekitplugins-inference plugin, which already supports NIM and Ollama.

What changes were proposed in this pull request?

A vLLM plugin that makes it easy to create a pod template that serves a vLLM model from an init container of a Flyte task. The user passes a Hugging Face secret name and the Hugging Face model they want to serve.

import flytekit as fl
from flytekitplugins.inference import HFSecret, VLLM
from openai import OpenAI

model_name = "google/gemma-2b-it"
hf_token_key = "vllm_hf_token"

vllm_args = {
    "model": model_name,
    "dtype": "half",
    "max-model-len": 2000,
}

hf_secrets = HFSecret(
    secrets_prefix="_FSEC_",
    hf_token_key=hf_token_key
)

vllm_instance = VLLM(
    hf_secret=hf_secrets,
    arg_dict=vllm_args
)

image = fl.ImageSpec(
    name="vllm_serve",
    registry="...",
    packages=["flytekitplugins-inference"],
)


@fl.task(
    pod_template=vllm_instance.pod_template,
    container_image=image,
    secret_requests=[
        fl.Secret(
            key=hf_token_key, mount_requirement=fl.Secret.MountType.ENV_VAR  # must be mounted as an env var
        )
    ],
)
def model_serving() -> str:
    client = OpenAI(
        base_url=f"{vllm_instance.base_url}/v1", api_key="vllm"  # api key required but ignored
    )

    completion = client.chat.completions.create(
        model=model_name,
        messages=[
            {
                "role": "user",
                "content": "Compose a haiku about the power of AI.",
            }
        ],
        temperature=0.5,
        top_p=1,
        max_tokens=1024,
    )
    return completion.choices[0].message.content
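The `vllm_args` dict above presumably maps one-to-one onto `vllm serve` command-line flags for the init container. A minimal sketch of that mapping — the helper name and the exact rendering are assumptions for illustration, not the plugin's actual code:

```python
# Hypothetical sketch: render a vLLM arg dict into CLI flags for the
# serving process. The real plugin's rendering may differ.
def render_vllm_flags(arg_dict: dict) -> list[str]:
    flags = []
    for key, value in arg_dict.items():
        flags.append(f"--{key}")   # e.g. "max-model-len" -> "--max-model-len"
        flags.append(str(value))
    return flags

vllm_args = {
    "model": "google/gemma-2b-it",
    "dtype": "half",
    "max-model-len": 2000,
}
print(render_vllm_flags(vllm_args))
# ['--model', 'google/gemma-2b-it', '--dtype', 'half', '--max-model-len', '2000']
```

Because the dict keys are passed through as flag names, any option the vLLM server accepts can be forwarded without the plugin enumerating them.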

How was this patch tested?

Unit tests and running a remote workflow from the README.
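For context on the env-var mount requirement in the example: inside the container, the Hugging Face token would typically be read from an environment variable whose name combines the `_FSEC_` prefix with the secret key. The exact name composition (casing, separators) is an assumption here, not confirmed by the plugin:

```python
import os

# Hypothetical: the secret env var name is assumed to be the secrets
# prefix followed by the upper-cased key; the plugin's actual scheme
# may differ.
def hf_token_env_var(prefix: str, key: str) -> str:
    return f"{prefix}{key.upper()}"

name = hf_token_env_var("_FSEC_", "vllm_hf_token")
print(name)  # _FSEC_VLLM_HF_TOKEN
token = os.environ.get(name)  # None outside a Flyte pod
```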

Setup process

Screenshots

Check all the applicable boxes

  • I updated the documentation accordingly.
  • All new and existing tests passed.
  • All commits are signed-off.

Related PRs

Docs link

Signed-off-by: Daniel Sola <[email protected]>
codecov bot commented Nov 29, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 76.45%. Comparing base (e24b9c1) to head (acb1639).
Report is 1 commit behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #2967      +/-   ##
==========================================
+ Coverage   75.71%   76.45%   +0.73%     
==========================================
  Files         214      200      -14     
  Lines       21598    20922     -676     
  Branches     2693     2694       +1     
==========================================
- Hits        16352    15995     -357     
+ Misses       4489     4202     -287     
+ Partials      757      725      -32     

☔ View full report in Codecov by Sentry.

@Future-Outlier (Member) commented:

This is huge!

Signed-off-by: Daniel Sola <[email protected]>
mem: str = "10Gi",
):
"""
Initialize NIM class for managing a Kubernetes pod template.
A reviewer left a suggested change on the docstring:

- Initialize NIM class for managing a Kubernetes pod template.
+ Initialize VLLM class for managing a Kubernetes pod template.

@samhita-alla (Contributor) left a review:
lovely!
