Home
This is the back-end part of a generic boilerplate for basic ML/DL projects. It provides the basic functionality for one-off predictions from simple inputs (i.e. not images, for example, or large numbers of inputs).
It contains nothing more than the configuration files for the project: no actual code.
- FastAPI serving the back end
- Model storage on GCS / MLFlow (python functions to be written)
- Everything to deploy with Docker
Hit the green Use this template button at the top of the template repo, or follow this link, to create a new repository.
Once created, clone your new repo to your (and your teammates') machines with the green <> Code button and your favourite method (probably the GitHub CLI).
- packagename/ : your package (rename this, and adapt the configuration files; tip: Ctrl-Shift-F or Cmd-Shift-F in VS Code)
- api/ : this is where your API code goes
- models/ : your saved models (not tracked by git; should be stored elsewhere: GCS, MLflow, ...)
- raw_data/ : your data (not tracked by git; should be stored elsewhere in the cloud: GCS, MLflow, ...)
- notebooks/ : your notebooks, tracked by git, but avoid working with several people on one notebook (include your name in the filename)
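Since the template leaves the GCS/MLflow storage functions to be written, here is a hedged sketch of what saving a model to Google Cloud Storage could look like. The bucket layout and function names are assumptions; it requires the google-cloud-storage package and valid GCP credentials:

```python
import pickle


def model_blob_name(model_name: str, version: str) -> str:
    """Build a deterministic blob path for a model version (assumed layout)."""
    return f"models/{model_name}/{version}/model.pkl"


def upload_model(model, bucket_name: str, model_name: str, version: str) -> str:
    """Pickle a fitted model and upload it to a GCS bucket; returns the blob name."""
    # Deferred import so the rest of the package works without GCS installed
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(model_blob_name(model_name, version))
    blob.upload_from_string(pickle.dumps(model))
    return blob.name
```

Loading is symmetrical: download the blob's bytes and `pickle.loads` them back into a model object.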
Copy .env.sample and .env.yaml.sample to new files .env and .env.yaml, and update all variables with your project identifiers.
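As an illustration only, a filled-in .env might look like the fragment below. The variable names here are hypothetical; the authoritative list is whatever .env.sample contains:

```bash
# .env — illustrative values, not the template's actual variable names
GCP_PROJECT=my-project-id
GCP_REGION=europe-west1
BUCKET_NAME=my-project-bucket
```

Keep .env and .env.yaml out of git; only the .sample files should be tracked.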
Requirements files:
- requirements.txt : for production, excluding all packages that are redundant in prod
- requirements_dev.txt : adds packages for local use (ipython, jupyter notebook, debugging, matplotlib, etc.)
Update requirements.txt if you use scikit-learn and/or TensorFlow (uncomment the respective lines).
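The split between the two files could look like the sketch below. The exact package list is an assumption; the template's own files are authoritative:

```
# requirements.txt (production)
fastapi
uvicorn
# scikit-learn    # uncomment if you use it
# tensorflow      # uncomment if you use it

# requirements_dev.txt (local development)
-r requirements.txt
ipython
jupyter
matplotlib
```

The `-r requirements.txt` line pulls the production dependencies into the dev environment, so you only maintain the shared list in one place.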
Make sure you have renamed the packagename folder and all references to it.
Create a new virtual environment, and install all packages:

```bash
pyenv virtualenv 3.10.6 <your-project-name>
pyenv local <your-project-name>
pip install -r requirements.txt
pip install -r requirements_dev.txt
```
When finished, run make test_structure and check any errors or warnings.
All commands to deploy using Docker and GCP are included in the Makefiles:
- make run_api : run the API locally
- make docker_##### : Docker-related targets:
  - build and run (interactively) a local Docker image
  - build and run (interactively) a Docker image ready for GCP (i.e. using linux/amd64 infra)
  - push and deploy to GCP
Just type make <tab> on the command line to list all possibilities.
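For orientation, the Docker targets could be shaped like the sketch below. The target and variable names are placeholders (the real names behind docker_##### come from the template's Makefile):

```make
# Illustrative Makefile fragment — target names are assumptions
run_api:
	uvicorn packagename.api.fast:app --reload

docker_build_local:
	docker build -t $(IMAGE):local .

docker_run_local:
	docker run -it -e PORT=8000 -p 8000:8000 $(IMAGE):local

docker_build_gcp:
	# GCP's Cloud Run expects linux/amd64 images
	docker build --platform linux/amd64 -t $(IMAGE):prod .
```

Note the `--platform linux/amd64` flag: that is what the "ready for GCP" build refers to, and it matters when building on Apple Silicon machines.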