Local LLM

Llama 3 running on Apple Silicon with MLX

Backend

Installation

conda create --name backend
conda activate backend
conda install --yes --file backend/requirements.txt
conda install pytorch torchvision torchaudio -c pytorch
conda install -c conda-forge mlx-lm
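To confirm the environment works, a quick mlx-lm generation in Python should succeed. The 4-bit Llama 3 checkpoint named below is an example from the mlx-community Hugging Face organization, not necessarily the one this repository loads:

# sanity_check.py -- minimal sketch; the model name is an assumption
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Meta-Llama-3-8B-Instruct-4bit")
print(generate(model, tokenizer, prompt="Hello from Apple Silicon!", max_tokens=32))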

Run

uvicorn backend.app.main:app --host 0.0.0.0 --port 8000
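The contents of backend/app/main.py are not reproduced here; the sketch below shows one plausible shape for it, assuming a FastAPI app with a single /chat endpoint (the route name and model checkpoint are assumptions, not confirmed by the repository):

# backend/app/main.py -- minimal sketch, not the repository's confirmed implementation
from fastapi import FastAPI
from pydantic import BaseModel
from mlx_lm import load, generate

app = FastAPI()

# Load the model once at startup so every request reuses it.
# The checkpoint name is an assumption; substitute the model you actually use.
model, tokenizer = load("mlx-community/Meta-Llama-3-8B-Instruct-4bit")

class ChatRequest(BaseModel):
    prompt: str
    max_tokens: int = 256

@app.post("/chat")  # hypothetical route; the real app may expose different paths
def chat(req: ChatRequest) -> dict:
    # One blocking generation per request; fine for a single local user.
    text = generate(model, tokenizer, prompt=req.prompt, max_tokens=req.max_tokens)
    return {"response": text}

With the server running, the endpoint can be exercised from a shell:

curl -X POST http://localhost:8000/chat -H "Content-Type: application/json" -d '{"prompt": "Hello"}'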

Frontend

Installation

npx create-react-app app

Development mode

cd app
npm start
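In development the React app runs on http://localhost:3000 while the API listens on port 8000, so the browser treats API calls as cross-origin. If the backend does not already allow this, FastAPI's CORSMiddleware can be added to the sketch above (origin list assumed):

# Allow the CRA dev server's origin to call the API (add to backend/app/main.py).
from fastapi.middleware.cors import CORSMiddleware

app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],  # assumed dev origin
    allow_methods=["*"],
    allow_headers=["*"],
)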

Production

cd app
npm run build
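npm run build writes a static bundle to app/build. One option, sketched under the assumption that the backend should also serve the frontend (the repository may deploy it differently), is to mount the bundle on the FastAPI app from the sketch above:

# Serve the React production build from FastAPI (paths assumed).
from fastapi.staticfiles import StaticFiles

# Mount last, after the API routes, so /chat still resolves first.
app.mount("/", StaticFiles(directory="app/build", html=True), name="frontend")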
