This repository was built with reference to the implementations of mlc_llm and mlc_imp. It supports converting models such as Llama, Gemma, and LLaVA, and includes the implementations needed to process them. Future updates will add support for a broader range of foundation models.
You can download the prebuilt APK file from the Google Drive link.
The instructions below are written for Mac. If you are using a different OS, please adjust the settings accordingly.
- Follow the official link and install the prerequisites (Rust, Android Studio, JDK).
- Clone the repo

```shell
git clone --recursive https://github.com/nota-github/mlc_mobile_fm.git
cd mlc_mobile_fm
```
- Add environment variables to `~/.zshrc`

```shell
export TVM_HOME="absolute/path/to/mlc_mobile_fm/3rdparty/tvm"
export PYTHONPATH=$TVM_HOME/python:${PYTHONPATH}
```
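If you prefer not to hard-code the absolute path, the same two exports can be derived from the clone location. This is only a sketch: `MLC_ROOT` is a hypothetical helper variable, not something the repo defines, and you should point it at wherever you actually cloned the repo.

```shell
# Hypothetical helper: point MLC_ROOT at your clone, then derive the exports.
# Adjust the path if your clone lives somewhere other than $HOME.
MLC_ROOT="$HOME/mlc_mobile_fm"
export TVM_HOME="$MLC_ROOT/3rdparty/tvm"
export PYTHONPATH="$TVM_HOME/python:${PYTHONPATH}"
echo "TVM_HOME=$TVM_HOME"
```

Open a new shell (or `source ~/.zshrc`) so the exports take effect before building.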
- Create a virtual environment

```shell
conda env remove -n mlc-chat-venv -y
conda create -n mlc-chat-venv -c conda-forge \
    "llvmdev==15" \
    "cmake>=3.24" \
    git \
    python=3.11 -y
conda activate mlc-chat-venv
```
- Build TVM from scratch

```shell
cd 3rdparty/tvm
mkdir build && cd build
cp ../cmake/config.cmake .
echo "set(USE_LLVM \"llvm-config --ignore-libllvm --link-static\")" >> config.cmake
echo "set(HIDE_PRIVATE_SYMBOLS ON)" >> config.cmake
echo "set(CMAKE_BUILD_TYPE RelWithDebInfo)" >> config.cmake
echo "set(USE_METAL ON)" >> config.cmake
cmake .. && cmake --build . --parallel 4 && cd ../../../
```
For more detailed information about these options, please refer to the "Option 2. Build from Source" section of the linked documentation.
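The escaped quotes in the `echo` lines above are easy to get wrong. As a sanity check, the same four options can be appended and counted in a scratch directory — a dry-run sketch only; in a real build you append to `build/config.cmake` as shown above.

```shell
# Dry-run sketch: append the four options to a scratch config.cmake and count them.
demo=$(mktemp -d)
cp /dev/null "$demo/config.cmake"   # stands in for the copied ../cmake/config.cmake
{
  echo 'set(USE_LLVM "llvm-config --ignore-libllvm --link-static")'
  echo 'set(HIDE_PRIVATE_SYMBOLS ON)'
  echo 'set(CMAKE_BUILD_TYPE RelWithDebInfo)'
  echo 'set(USE_METAL ON)'
} >> "$demo/config.cmake"
n=$(grep -c '^set(' "$demo/config.cmake")
echo "appended $n options"    # prints "appended 4 options"
rm -rf "$demo"
```

If any of the lines fail to appear in your real `config.cmake`, the CMake configure step will silently fall back to the defaults, so a quick `grep set config.cmake` before running `cmake ..` is worthwhile.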
- Build the MLC LLM module from scratch

```shell
mkdir build && cd build
python ../cmake/gen_cmake_config.py
cmake .. && cmake --build . --parallel 4 && cd ..
cd python
pip install -e .
cd ..
```
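As a quick smoke test — assuming the `mlc-chat-venv` environment is still active — both `tvm` and `mlc_llm` should now be discoverable by Python. This sketch only checks that the packages can be found, not that their native libraries load correctly.

```shell
# Check that the editable installs are visible to the active Python.
result=$(python3 - <<'EOF'
import importlib.util
missing = [n for n in ("tvm", "mlc_llm") if importlib.util.find_spec(n) is None]
print("ok" if not missing else "missing: " + ", ".join(missing))
EOF
)
echo "$result"
```

If either package is reported missing, re-check the `TVM_HOME`/`PYTHONPATH` exports and rerun `pip install -e .` from the `python` directory.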
- Package the model for Android

```shell
bash package_from_scratch.sh
```
- Open the `android` directory in Android Studio and run the app.