Berri AI is a Python package that helps developers quickly and easily deploy their LLM App from Google Colab directly to production. Just install the package, import the function, and run deploy. At the end of the deployment (~10-15 minutes), you will get:
- 🎉 A web app to interact with your agent 👉 example
- 😱 An endpoint you can query 👉 https://agent-repo-35aa2cf3-a0a1-4cf8-834f-302e5b7fe07e-45247-8aqi.zeet-team-ishaan-jaff.zeet.app/langchain_agent?query="who is obama?"
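For reference, here's a minimal sketch (not part of the package) of how you might query that endpoint from Python with the requests library; the URL is the example endpoint above, and the query parameter name follows its format:

import requests

# Example endpoint from above - replace it with the URL you receive after your own deployment.
url = "https://agent-repo-35aa2cf3-a0a1-4cf8-834f-302e5b7fe07e-45247-8aqi.zeet-team-ishaan-jaff.zeet.app/langchain_agent"
response = requests.get(url, params={"query": "who is obama?"})
print(response.text)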
Berri AI is built specifically for the Agent class of the Langchain framework, a popular Python framework for building LLM apps.
There are 4 major ways you can use Berri:
- Pipelines: Best way to get started. Pipelines let you spin up an LLM App in 2 lines of code.
- GPT-Index: If you're writing an LLM app with the primary method of interaction being gpt-index's .query() function.
- Langchain: If you're writing an LLM app with the primary method of interaction being Langchain's 'initialize_agent' or 'AgentExecutor()' functions.
- Wrapper functions: For more complex use-cases. If you're taking a user query and doing multiple things (LLM calls, api calls, etc.) with it, you can put them in a wrapper function and pass the wrapper function to Berri.
Today we support 1 pipeline - docQAPipeline: this lets you paste a URL link to your documentation and get a shareable web app to use it, in 15 minutes. We'll handle the chunking, vectorizing, agent initialization, and deployment for you.
- Install the package:
pip install berri-ai
- Import the pipeline:
from berri_ai import docQAPipeline
- Initiate the deployment by providing your email address, OpenAI API key, and documentation URL:
docQAPipeline(user_email, open_ai_key, input_url) # example docQAPipeline(user_email="[email protected]", open_ai_key="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", input_url="https://stripe.com/docs/india-accept-international-payments#TransactionPurposeCode")
Go here to get your OpenAI API key.
To use Berri AI w/ GPT-Index, follow these steps:
- Install the package:
pip install berri-ai
- Import the deploy method:
from berri_ai import deploy_gpt_index
- Initiate the deployment by providing your email address:
deploy_gpt_index(user_email=<your email>) # example deploy_gpt_index(user_email="[email protected]")
Note: Today, Berri will only look for the '.query()' function. Let us know if there are other use-cases you would like us to support.
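For reference, here's a minimal sketch of the kind of gpt-index code Berri scans for. It assumes a local data/ folder of documents and uses gpt-index's SimpleDirectoryReader and GPTSimpleVectorIndex (the gpt-index API has changed across versions, so adjust to the version you have installed):

from gpt_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# Build an index over local documents (assumes a ./data folder with your files).
documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents)

# Berri looks for this .query() call when packaging your app.
response = index.query("What is this documentation about?")
print(response)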
To use Berri AI w/ Langchain, follow these steps:
- Install the package:
pip install berri-ai
- Import the deploy method:
from berri_ai import deploy
- Initiate the deployment by providing your email address:
deploy(user_email=<your email>) # example deploy(user_email="[email protected]")
Note: Today, Berri will look for the initialize_agent() and AgentExecutor() functions in your code. If you're using another way of initializing your agent, let us know and we'll update the package to account for that.
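For reference, here's a minimal sketch of the kind of Langchain code Berri scans for, using initialize_agent. It assumes an OpenAI API key (and a SerpAPI key for the search tool) are set in your environment; swap in your own tools and agent type:

from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI

# Assumes OPENAI_API_KEY and SERPAPI_API_KEY are set as environment variables.
llm = OpenAI(temperature=0)
tools = load_tools(["serpapi"], llm=llm)

# Berri looks for this initialize_agent() call (or an AgentExecutor()) when packaging your app.
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("who is obama?")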
Once deployment is complete, you will receive an email notification. The entire process usually takes 10-15 minutes.
To use Berri AI w/ Wrapper Functions, follow these steps:
- Install the package:
pip install berri-ai
- Import the deploy method:
from berri_ai import deploy_func
- Initiate the deployment by providing your email address, the stringified name of your executing function, and a test query:
deploy_func(user_email=<your email>, executing_function=<stringified name of your executing function>, test_str=<test_user_input_query>) # example deploy_func(user_email="[email protected]", executing_function="print_answer", test_str="what is ManimML?")
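For example, a hypothetical print_answer wrapper function like the one named in the example above might look like this; the single LLM call inside is just a placeholder, and your wrapper can chain any number of LLM calls, API calls, etc., as long as it takes the user query and returns the answer:

from langchain.llms import OpenAI

def print_answer(query: str) -> str:
    # Hypothetical wrapper: take the user query, run whatever steps you need
    # (LLM calls, API calls, retrieval, ...) and return the final answer.
    llm = OpenAI(temperature=0)  # assumes OPENAI_API_KEY is set in your environment
    return llm(f"Answer the following question: {query}")

# Deploy it by passing the stringified function name and a test query:
# deploy_func(user_email="[email protected]", executing_function="print_answer", test_str="what is ManimML?")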
- Berri LangChain Youtube Agent Example
- Berri LangChain Search Agent Example
- Berri GPT Index + Langchain Document QA Example
If you have any questions or need help using Berri AI, join the Discord or text/WhatsApp us at +17708783106 📱.