# ai-labs

If you want to run ML tools and their prerequisites/dependencies in your experimentation environment, what is the shortest path? This repository collects examples of experiments that you can execute relatively easily, even without experience installing Python.

## Inside an Artificial Neural Network

See [lab-contents/001_inside_an_artificial_neural_network](lab-contents/001_inside_an_artificial_neural_network).

## First Machine Learning Experiments

### Ways to Execute ML Training and Inference

First, let's go through some methods of executing ML processes without first installing prerequisites in your physical environment, such as your laptop.

#### Using ML Cloud Providers

Machine learning cloud providers let you use the most powerful models, which might be impossible for you to run otherwise.

```mermaid
flowchart TB
  subgraph provider[ML Cloud Provider]
    service[ML Service]
  end
  laptop-->|Calling API|service
```
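
As a concrete illustration of the "laptop just calls an API" path, here is a minimal Python sketch that sends a prompt to Gemini through the google-generativeai package. The model name, the prompt, and the `GOOGLE_API_KEY` environment variable are assumptions for the example, not something defined by this repository.

```python
# Minimal sketch of calling a hosted LLM (Gemini) instead of running a model locally.
# Assumes: `pip install google-generativeai` and an API key in the GOOGLE_API_KEY env var.
import os

import google.generativeai as genai

# Authenticate against the cloud provider with your API key.
genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# Pick a hosted model; the model name here is only an example.
model = genai.GenerativeModel("gemini-1.5-flash")

# The heavy lifting happens on the provider's side; the laptop only calls the API.
response = model.generate_content("Explain what an artificial neuron is in two sentences.")
print(response.text)
```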

#### Using Docker

Use container images that already have software such as Python, PyTorch, fastai, Pandas, and Jupyter preinstalled.

```mermaid
flowchart TB
    container_image-->|saved to|docker_hub
    subgraph laptop[Your Laptop]
      persistent_files[Persistent Files]
      container[Disposable Container]-->|mounts|persistent_files
    end
    docker_hub[Docker Hub]
    container-->|pulled from|docker_hub
    subgraph container_image[Container Image]
      Python
      PyTorch
      fastai
      Pandas
      Jupyter
    end
```

The prerequisite for this approach is Docker installed in a Mac, Linux, or WSL (Windows) environment.
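
To make the diagram above concrete, here is a hedged Python sketch of the same flow using the docker SDK (docker-py): pull a prebuilt image, run a disposable container, and mount a host folder for persistent files. The image name and paths are placeholders chosen for illustration, not choices made by this repository.

```python
# Sketch of the "disposable container + mounted persistent files" pattern
# using the docker Python SDK (pip install docker). The image name and the
# host/container paths below are placeholders for illustration only.
import docker

client = docker.from_env()

# Pull a prebuilt image from Docker Hub so nothing has to be installed locally.
image = "pytorch/pytorch:latest"  # example image; pick one with the tools you need
client.images.pull(image)

# Run a throwaway container; the mounted host directory survives after it exits.
output = client.containers.run(
    image,
    command=["python", "-c", "import torch; print(torch.__version__)"],
    volumes={"/home/me/ai-labs-data": {"bind": "/workspace", "mode": "rw"}},
    working_dir="/workspace",
    remove=True,  # dispose of the container once the command finishes
)
print(output.decode())
```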

#### Using Lab VMs

When running in local Docker takes too many resources or too much time, an option is to run the workload on a cloud VM.

```mermaid
flowchart TB
  subgraph provider[Cloud Provider]
    VM[GPU-Accelerated VM]
  end
  laptop-->|remotely control|VM
```

The prerequisite for this approach is having tools installed for remotely controlling a cloud provider such as Azure.
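
As an illustration of what "remotely control" can look like in code, here is a hedged Python sketch that starts and later deallocates a GPU VM with the Azure SDK (azure-identity and azure-mgmt-compute). The subscription ID, resource group, and VM name are placeholders, and the repository itself may drive Azure differently (for example via the az CLI).

```python
# Sketch of remotely controlling a GPU lab VM with the Azure Python SDK
# (pip install azure-identity azure-mgmt-compute). All names below are
# placeholders for illustration only.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
resource_group = "ai-labs-rg"                              # placeholder
vm_name = "gpu-lab-vm"                                     # placeholder

compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# Start the GPU-accelerated VM before running a training job on it...
compute.virtual_machines.begin_start(resource_group, vm_name).result()

# ...and deallocate it afterwards so the GPU stops billing.
compute.virtual_machines.begin_deallocate(resource_group, vm_name).result()
```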

| Problem Class | Training/Inference Environment | ML Toolset | Experiment |
| --- | --- | --- | --- |
| LLM inference | cloud | Gemini 1.5 | Section |
| LLM inference | cloud | Gemini 2.0 | Section |
| LLM prompt with image | cloud | Gemini 2.0 | Section |
| Tabular training and inference | docker | PyTorch, fastai | Section |
| Tabular training and inference | docker | PyTorch, fastai, Jupyter | Section |
| Visual training and inference | docker | PyTorch Lightning, Jupyter | Page |
| Visual training and inference | cloud VM | PyTorch Lightning, Jupyter | Page |
| Visual training and inference | cloud VM | PyTorch Lightning, CLI | Page |
