Use CodeActAgent-Mistral-7b with Gradio as a runnable Code-LLM agent. 🤖
Executable Code Actions Elicit Better LLM Agents is a repo that uses executable code
to consolidate LLM agents' actions into a unified action space (CodeAct). Integrated with a Python interpreter,
CodeAct can execute code actions and dynamically revise prior actions or emit new actions upon new observations
(e.g., code execution results) through multi-turn interactions.
The original repo uses Hugging Face ChatUI and has a relatively complex front-end and back-end structure.
This project aims to simplify that structure by using only llama-cpp and Gradio.
Starting from quantized weights, it provides CodeActAgent as a runnable and interactive teaching Code LLM with a relatively simple structure.
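At its core, the notebook runs a generate → execute → observe loop: the model emits a code action, the interpreter runs it, and the execution result is fed back as an "Observation" for the next turn. The sketch below is a minimal, simplified version of that loop, assuming the model wraps code actions in `<execute>` tags (the CodeAct format) and using an illustrative model path; the actual prompt handling and execution sandbox live in the notebook.

```python
import re
import io
import contextlib
from llama_cpp import Llama

# Load the quantized CodeActAgent weights (file name is illustrative).
llm = Llama(model_path="codeactagent-mistral-7b-v0.1.q8_0.gguf", n_ctx=3060)

messages = [{"role": "user", "content": "Give me a python function that divides a number by itself 10 times."}]

for _ in range(5):  # a few CodeAct turns
    reply = llm.create_chat_completion(messages=messages)["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": reply})

    # If the model emitted a code action, run it and feed the result back as an observation.
    match = re.search(r"<execute>(.*?)</execute>", reply, re.DOTALL)
    if match is None:
        break  # plain answer, nothing left to execute
    buffer = io.StringIO()
    with contextlib.redirect_stdout(buffer):
        exec(match.group(1))  # executed in-process here for simplicity
    messages.append({"role": "user", "content": f"Observation: {buffer.getvalue()}"})
```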
`pip install -r requirements.txt`
It is recommended to install the GPU build of llama-cpp-python for a better experience (for example, by setting the CUDA-related `CMAKE_ARGS` when running `pip install llama-cpp-python`).
Launch the Jupyter notebook named "code_act_agent_gradio_demo.ipynb" and run every cell of the notebook.
Visit http://127.0.0.1:7860 in a browser, or use the public URL provided by Gradio.
The videos below demonstrate ways of interacting with the model.
- 1 divide number function:
Give me a python function that divides a number by itself 10 times.
div_512.mp4
"Observation:" in the chat context indicate the run conclusion of function, defined by LLM. This shows the runable and interactive ability of demo.
- 2 teach numpy:
Teach me how to use numpy.
div_vec.mp4
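The answer varies from run to run, but the model typically replies with a short tutorial plus runnable snippets along these illustrative lines (array creation and vectorized arithmetic, including the element-wise division shown in the video):

```python
import numpy as np

# Create arrays and apply vectorized operations (no explicit Python loops).
a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.arange(1, 5, dtype=float)  # [1., 2., 3., 4.]

print(a + b)     # element-wise addition
print(a / b)     # element-wise (vectorized) division
print(a.mean())  # aggregate statistics
```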
- 1 image download function via pollinations.ai (which can retrieve images in a Stable Diffusion style):
Write python code that downloads an image from a url to the local disk, where the url has the format:
url = f'https://image.pollinations.ai/prompt/{prompt}'
and prompt is the input of the download function.
bee.mp4
In this example, the user has the model define the download_image function and download the bee image to the local disk.
The LLM is also able to correct erroneous output it produced itself.
When the agent saves the image without a file extension, the user can ask it, with a natural-language command, to fix the extensions of the local files,
which indicates the LLM acts as an agent rather than just a teacher.
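The function the agent writes in the video is roughly of the following shape (the name download_image comes from the description above; the default file name, `.jpg` extension, and timeout are illustrative assumptions):

```python
import requests

def download_image(prompt: str, out_path: str = "image.jpg") -> str:
    """Fetch a generated image for `prompt` from pollinations.ai and save it locally."""
    url = f'https://image.pollinations.ai/prompt/{prompt}'
    response = requests.get(url, timeout=60)
    response.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(response.content)
    return out_path

# For example, the bee image from the video:
download_image("a bee on a flower", "bee.jpg")
```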
- 1 simple box plot:
Plot a box plot with pandas and save it to the local disk.
boxplot.mp4
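The code the agent produces for this prompt is typically along the lines of the sketch below (the random data, column names, and output file name are illustrative):

```python
import numpy as np
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Illustrative random data with three columns.
df = pd.DataFrame(np.random.randn(100, 3), columns=["A", "B", "C"])

ax = df.plot(kind="box")          # pandas box plot
ax.figure.savefig("boxplot.png")  # save it to the local disk
```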
- 2 linear regression principle and data plot:
Draw a picture that teaches me what linear regression is.
reg.mp4
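For this prompt the agent usually generates noisy synthetic points, fits a straight line, and plots both together; a minimal sketch under those assumptions (the data, labels, and file name are illustrative):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

# Noisy synthetic data around the line y = 2x + 1.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=2.0, size=x.size)

# Least-squares fit of a degree-1 polynomial (the regression line).
slope, intercept = np.polyfit(x, y, 1)

plt.scatter(x, y, label="data")
plt.plot(x, slope * x + intercept, color="red", label=f"fit: y = {slope:.2f}x + {intercept:.2f}")
plt.legend()
plt.title("Linear regression")
plt.savefig("linear_regression.png")
```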
- 3 financial transaction process simulation:
Write a piece of Python code that simulates a financial transaction process and draws a line plot of the resulting Poisson process.
possion.mp4
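One plausible shape for the code generated here is to model transaction arrivals as a Poisson process and draw the cumulative transaction count as a line plot (the rate, time horizon, and file name below are illustrative assumptions):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
rate = 5.0      # average number of transactions per unit time
horizon = 10.0  # length of the simulated trading window

# Poisson process: exponential inter-arrival times accumulated until the horizon.
inter_arrivals = rng.exponential(1.0 / rate, size=int(rate * horizon * 3))
arrival_times = np.cumsum(inter_arrivals)
arrival_times = arrival_times[arrival_times <= horizon]

# Cumulative number of transactions over time, drawn as a step-style line plot.
plt.plot(arrival_times, np.arange(1, arrival_times.size + 1), drawstyle="steps-post")
plt.xlabel("time")
plt.ylabel("number of transactions")
plt.title("Simulated transaction arrivals (Poisson process)")
plt.savefig("poisson_transactions.png")
```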
- 1 Because of sampling randomness, the results of each run may differ, which encourages active exploration of more flexible ways to interact with the LLM and makes the demo more interesting.
- 2 The examples on the Gradio page provide some convenient instructions to facilitate interaction with the model, such as:
Give me the function definition. 💡
Correct it.☹️ ❌
Save the output as image 🖼️ to local. ⏬
Good Job 😊
You can find their usage in the videos above.
- 3 The maximum length of the total chat context is set to 3060 in the notebook; if you need more chat rounds, try to increase it.
- 4 I recommend running the demo on a GPU (10 GB of GPU memory is enough; all examples have been tested on a single GTX 1080 Ti or RTX 3060).
| Name | Type | HuggingFace Model link |
|---|---|---|
| xingyaoww/CodeActAgent-Mistral-7b-v0.1 | Mistral-7b 8-bit quantization | https://huggingface.co/xingyaoww/CodeActAgent-Mistral-7b-v0.1 |
svjack - https://huggingface.co/svjack - [email protected] - [email protected]
Project Link: https://github.com/svjack/CodeActAgent-Gradio