# 見に /mi·ni/ — with intent to see[^1]
This is a template repository for doing AI research. Features:
- Local Python notebooks
- Remote per-function GPU compute[^2]
- Inline visualization with remote-to-local callbacks
- AI-assisted coding with Copilot/VS Code
Above: screen recording of a local notebook running a remote training job[^3]. **a:** `track` is a function that runs locally — even when called from the remote function. **a′:** The plot is displayed directly in the notebook, showing training metrics in real time. **b:** `train` is a function that runs in the cloud (with a GPU). **b′:** The message "Training complete" is printed remotely, but the output is shown locally (no callback needed). **c:** A `with` statement creates a context that bridges the remote and local environments.
Read about how it works in `doc/hither-thither.md`.
## Code for the above demo
The code shown in the screen recording is:
```python
@run.hither
async def track(loss: float):
    history.append(loss)
    plot(history)


@run.thither(gpu='L4')
async def train(epochs: int, track):
    for _ in range(epochs):
        track(some_training_function())
    print('Training complete')


async with run(), track as callback:
    await train(25, callback)
```
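To make the control flow concrete, here is a toy, single-process sketch of the callback bridge: the "remote" function only ever sees a proxy, and each call is forwarded to the local side over a queue. This is an illustration only, not the library's actual mechanism (which serialises calls between machines; see `doc/hither-thither.md`).

```python
import asyncio

# Toy sketch of the hither/thither callback bridge (NOT the library's
# actual implementation): both "sides" run in one event loop, and the
# bridge is a plain asyncio.Queue instead of a network connection.

async def demo() -> list[float]:
    bridge: asyncio.Queue = asyncio.Queue()
    history: list[float] = []

    async def hither_side():
        # Local side: receive forwarded callback invocations until a
        # sentinel (None) signals that the remote function has finished.
        while (loss := await bridge.get()) is not None:
            history.append(loss)

    async def thither_side(epochs: int):
        # "Remote" side: calling the track proxy just sends the argument
        # across the bridge; this body never touches local state directly.
        for i in range(epochs):
            await bridge.put(1.0 / (i + 1))  # stand-in for a real loss
        await bridge.put(None)

    await asyncio.gather(hither_side(), thither_side(3))
    return history

print(asyncio.run(demo()))  # → [1.0, 0.5, 0.3333333333333333]
```

The same shape appears in the demo above: `track` plays the part of `hither_side`, and `train` receives only a forwarding proxy.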
## More cool features
- Dev container for a consistent environment, both locally and in Codespaces
- ML stack (PyTorch, Polars, etc.)
- Modern package management with uv
- Pre-configured for good engineering practices: tests, linting, type-checking (optional!)
## Getting started

First, open in GitHub Codespaces. Then:
```shell
./go install cpu  # CPU deps for local venv
./go auth         # Authenticate with Modal for remote compute
```
Open the Getting Started notebook and try it out (choose `.venv/bin/python3` as the kernel). For a more complete example, have a look at the nanoGPT notebook.
## Virtual environment
The Python environment is configured when the dev container is created.
Use uv to add and remove packages, and to run scripts:
```shell
uv add plotly --group local
uv run python example.py
```
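For reference, a command like `uv add plotly --group local` records the package under a PEP 735 dependency group in `pyproject.toml`, roughly like this (the version bound is illustrative):

```toml
[dependency-groups]
local = [
    "plotly>=5.0",
]
```

Grouping dependencies this way keeps environment-specific packages out of the project's base requirements.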
## Restarting the language server (VS Code)
If you open a Python file before the setup is complete, you may need to restart the Python language server.
- Open a `.py` or `.ipynb` file
- Open the command palette with ⇧⌘P or Ctrl+Shift+P
- Run **Python: Restart Language Server**
This project is dedicated to the public domain[^4][^5]. In your own experiments, there's no need to contribute back! The code is yours to modify as you please.
If you do want to contribute to this template, then fork it as usual. Before making a pull request, run:
```shell
./go check
```
[^1]: From 見に行く (mi-ni iku), meaning "to go for the purpose of seeing something." This library is about small AI experiments — quick, lightweight explorations to try and see what happens.

[^2]: Modal is used for remote compute. They charge per-second, billed for the duration of your function.

[^3]: The recording was edited: 1. labels were added; 2. the remote `train()` function was moved to the right so that the video wouldn't take up so much vertical space.

[^4]: Technically, the licence is the Unlicense, which is about as close as you can get to "do whatever you want".

[^5]: Exception: Code in `src/experiment` is derived from nanoGPT by Andrej Karpathy and is subject to MIT license terms. See the LICENSE file for details.