Commit
add beam.cloud
offchan42 authored Oct 10, 2023
1 parent b7c8d11 commit a462740
Showing 1 changed file with 2 additions and 1 deletion.
README.md — 3 changes: 2 additions & 1 deletion
@@ -216,10 +216,11 @@ Libraries that help you develop/debug/deploy the model in production (MLOps). Th

Remember that this is an opinionated list. There are bazillions of cloud providers out there. I'm not going to list them all. I'm just going to list the ones that I'm familiar with and I think are good.

- * https://modal.com/ Modal lets you run or deploy machine learning models, massively parallel compute jobs, task queues, web apps, and much more, without your own infrastructure. Their service is so interesting [someone make a course about it](https://twitter.com/jim_dowling/status/1604464301015797760).
+ * https://modal.com/ Modal lets you run or deploy machine learning models, massively parallel compute jobs, task queues, web apps, and much more, without your own infrastructure.
* https://lambdalabs.com/ GPU cloud built for deep learning. Instant access to the best prices for cloud GPUs on the market, with no commitments or negotiations required. Save over 73% vs AWS, Azure, and GCP. Configured for deep learning with PyTorch, TensorFlow, and Jupyter.
* https://www.runpod.io/ Save over 80% on GPUs. GPU rental made easy with Jupyter for PyTorch, TensorFlow, or any other AI framework. I've used it before; it's quite easy to use.
* https://www.banana.dev/ Ship ML to prod, instantly. ⚡ Scale your machine learning inference and training on serverless GPUs. I've heard a reviewer say they have the cheapest GPUs for inference.
+ * https://www.beam.cloud/ On-demand GPU compute: train and deploy AI and LLM applications securely on serverless GPUs, without managing infrastructure.

### Data Storage
