Official implementation of MLZero: A Multi-Agent System for End-to-end Machine Learning Automation
AutoGluon Assistant (aka MLZero) is a multi-agent system that automates end-to-end machine learning and deep learning workflows, transforming raw multimodal data into high-quality ML solutions with zero human intervention. Leveraging specialized perception agents, dual-memory modules, and iterative code generation, it handles diverse data formats while maintaining a high success rate across complex ML tasks.
AutoGluon Assistant supports Python 3.8–3.11 and is currently available on Linux (dependency issues on macOS and Windows will be fixed by our next official release).
You can install it from source (a new version will be released to PyPI soon):
pip install uv
uv pip install git+https://github.com/autogluon/autogluon-assistant.git
For detailed usage instructions, Anthropic/Azure/OpenAI setup, and advanced configuration options, see our Getting Started Tutorial.
MLZero uses AWS Bedrock by default. Configure your AWS credentials:
export AWS_DEFAULT_REGION="<your-region>"
export AWS_ACCESS_KEY_ID="<your-access-key>"
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
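Optionally, if you have the AWS CLI installed, you can confirm that these credentials are being picked up before launching MLZero:

```shell
# prints the account and IAM identity associated with the exported credentials
aws sts get-caller-identity
```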
We also support Anthropic, Azure, and OpenAI. Support for more LLM providers (e.g., DeepSeek) will be added soon.
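If you use one of these providers instead of Bedrock, export the corresponding API key before launching MLZero. The exact variable names MLZero reads are documented in the Getting Started Tutorial; the names below are the conventional ones used by each provider's SDK and are shown here only as an illustration:

```shell
# Anthropic (conventional SDK variable; see the tutorial for MLZero-specific config)
export ANTHROPIC_API_KEY="<your-anthropic-key>"

# OpenAI
export OPENAI_API_KEY="<your-openai-key>"

# Azure OpenAI typically requires both a key and an endpoint
export AZURE_OPENAI_API_KEY="<your-azure-key>"
export AZURE_OPENAI_ENDPOINT="<your-azure-endpoint>"
```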
mlzero -i <input_data_folder> [-t <optional_user_instructions>]
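For example, assuming a hypothetical dataset folder `./my_dataset` containing your raw data files:

```shell
# hypothetical path and instruction, shown only to illustrate the CLI shape
mlzero -i ./my_dataset -t "Train a classifier and report accuracy on the test split"
```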
mlzero-backend # command to start backend
mlzero-frontend # command to start frontend on port 8509 (default)
- Configure: Set your model provider and credentials in settings
- Upload & Describe: Drag your data folder into the chat input box, then type what you want to accomplish and press Enter
Note: The system can run on a single machine or be distributed across multiple machines (e.g., server on EC2, client running locally).
- Start the server
mlzero-backend # command to start backend
mlzero-mcp-server # command to start the MCP service (run this in a new terminal)
- Start the client
mlzero-mcp-client
Note: You may need to set up port tunneling to expose your local MCP Client Server (port 8005) if you want to use it with remote LLM services (e.g., Claude API, OpenAI API).
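One common way to expose a local port is reverse SSH forwarding to a publicly reachable host; the host and user below are placeholders, and tunneling services such as ngrok serve the same purpose:

```shell
# make local port 8005 reachable as port 8005 on a public host (hypothetical host/user)
ssh -R 8005:localhost:8005 <user>@<public-host>
```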
from autogluon.assistant.coding_agent import run_agent

run_agent(
    input_data_folder="<your-input-folder>",    # path to your raw data
    output_folder="<your-output-folder>",       # where generated code and artifacts go
    # more args ...
)
If you use AutoGluon Assistant (MLZero) in your research, please cite our paper:
@misc{fang2025mlzeromultiagentendtoendmachine,
      title={MLZero: A Multi-Agent System for End-to-end Machine Learning Automation},
      author={Haoyang Fang and Boran Han and Nick Erickson and Xiyuan Zhang and Su Zhou and Anirudh Dagar and Jiani Zhang and Ali Caner Turkmen and Cuixiong Hu and Huzefa Rangwala and Ying Nian Wu and Bernie Wang and George Karypis},
      year={2025},
      eprint={2505.13941},
      archivePrefix={arXiv},
      primaryClass={cs.MA},
      url={https://arxiv.org/abs/2505.13941},
}