
refactor: Refactor for core SDK #1092

Merged
merged 4 commits into eosphoros-ai:main from sdk-dev on Jan 21, 2024

Conversation

@fangyinc (Collaborator) commented on Jan 19, 2024

Description

Release the core SDK.

How Has This Been Tested?

The package is currently uploaded to TestPyPI; you can try it with:

pip install -i https://test.pypi.org/simple/ dbgpt --upgrade
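
A quick sanity check after installation (an illustrative one-liner; it only assumes the distribution is named dbgpt and exposes standard packaging metadata):

python -c "import dbgpt; import importlib.metadata as m; print(m.version('dbgpt'))"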

Example 1:

First, install the necessary dependencies:

pip install -i https://test.pypi.org/simple/ dbgpt --upgrade
pip install openai

Create a Python file named simple_sdk_llm_example_dag.py with the following content:

# simple_sdk_llm_example_dag.py

from dbgpt.core import BaseOutputParser
from dbgpt.core.awel import DAG
from dbgpt.core.operator import (
    PromptBuilderOperator,
    RequestBuilderOperator,
)
from dbgpt.model.proxy import OpenAILLMClient
from dbgpt.model.operator import LLMOperator

with DAG("simple_sdk_llm_example_dag") as dag:
    prompt_task = PromptBuilderOperator(
        "Write a SQL of {dialect} to query all data of {table_name}."
    )
    model_pre_handle_task = RequestBuilderOperator(model="gpt-3.5-turbo")
    llm_task = LLMOperator(OpenAILLMClient())
    out_parse_task = BaseOutputParser()
    prompt_task >> model_pre_handle_task >> llm_task >> out_parse_task
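
As written, the script only assembles the DAG, so running it directly produces no output. A minimal entrypoint can be appended to execute the chain end to end; this is a sketch that assumes the final operator exposes AWEL's async call(...) and that call_data carries the prompt variables dialect and table_name from the template above:

# Append to simple_sdk_llm_example_dag.py
import asyncio

if __name__ == "__main__":
    # Fill the prompt variables and run the whole chain, printing the parsed output.
    output = asyncio.run(
        out_parse_task.call(call_data={"dialect": "mysql", "table_name": "user"})
    )
    print(f"Output:\n{output}")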

Run this Python script:

export OPENAI_API_KEY=sk-xx
export OPENAI_API_BASE=https://xx:80/v1

python simple_sdk_llm_example_dag.py

Example 2:

First, install the necessary dependencies:

pip install -i https://test.pypi.org/simple/ dbgpt --upgrade
pip install fastapi uvicorn openai

Create a Python file named simple_chat_dag_example.py with the following content:

# simple_chat_dag_example.py

from dbgpt._private.pydantic import BaseModel, Field
from dbgpt.core import ModelMessage, ModelRequest
from dbgpt.core.awel import DAG, HttpTrigger, MapOperator
from dbgpt.model.operator import LLMOperator


class TriggerReqBody(BaseModel):
    model: str = Field(..., description="Model name")
    user_input: str = Field(..., description="User input")


class RequestHandleOperator(MapOperator[TriggerReqBody, ModelRequest]):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

    async def map(self, input_value: TriggerReqBody) -> ModelRequest:
        messages = [ModelMessage.build_human_message(input_value.user_input)]
        print(f"Receive input value: {input_value}")
        return ModelRequest.build_request(input_value.model, messages)


with DAG("dbgpt_awel_simple_dag_example") as dag:
    # Receive http request and trigger dag to run.
    trigger = HttpTrigger(
        "/examples/simple_chat", methods="POST", request_body=TriggerReqBody
    )
    request_handle_task = RequestHandleOperator()
    llm_task = LLMOperator(task_name="llm_task")
    model_parse_task = MapOperator(lambda out: out.to_dict())
    trigger >> request_handle_task >> llm_task >> model_parse_task


if __name__ == "__main__":
    if dag.leaf_nodes[0].dev_mode:
        # Development mode, you can run the dag locally for debugging.
        from dbgpt.core.awel import setup_dev_environment

        setup_dev_environment([dag], port=5555)
    else:
        # Production mode, DB-GPT will automatically load and execute the current file after startup.
        pass

Run this Python script:

export OPENAI_API_KEY=sk-xx
export OPENAI_API_BASE=https://xx:80/v1

python simple_chat_dag_example.py

Test the HTTP API:

DBGPT_SERVER="http://127.0.0.1:5555"
MODEL="gpt-3.5-turbo"

curl -X POST $DBGPT_SERVER/api/v1/awel/trigger/examples/simple_chat \
-H "Content-Type: application/json" -d '{
    "model": "'"$MODEL"'",
    "user_input": "hello"
}'
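
The same trigger can also be exercised from Python instead of curl; the snippet below mirrors the request above (it assumes the requests library is installed and that the server from the previous step is listening on port 5555):

# test_simple_chat.py
import requests

DBGPT_SERVER = "http://127.0.0.1:5555"
MODEL = "gpt-3.5-turbo"

# Same endpoint and body as the curl example above.
resp = requests.post(
    f"{DBGPT_SERVER}/api/v1/awel/trigger/examples/simple_chat",
    json={"model": MODEL, "user_input": "hello"},
    timeout=60,
)
print(resp.status_code)
print(resp.text)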

Snapshots:

Include snapshots for easier review.

Checklist:

  • My code follows the style guidelines of this project
  • I have already rebased the commits and made the commit message conform to the project standard.
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • Any dependent changes have been merged and published in downstream modules

@github-actions bot added the internal label (DB-GPT internal flag: chore|ci|refactor|test) on Jan 19, 2024
csunny previously approved these changes on Jan 20, 2024

@csunny (Collaborator) left a comment:

r+

@Aries-ckt (Collaborator) left a comment:

LGTM

@csunny (Collaborator) left a comment:

r+

@csunny merged commit 2d90519 into eosphoros-ai:main on Jan 21, 2024
5 checks passed
@fangyinc deleted the sdk-dev branch on January 22, 2024 at 02:09
Hopshine pushed a commit to Hopshine/DB-GPT that referenced this pull request on Sep 10, 2024
Labels
internal (DB-GPT internal flag: chore|ci|refactor|test)
3 participants