ml-serving
Here are 12 public repositories matching this topic...
- Collection of OSS models that are containerized into a serving container. Updated Sep 19, 2023 - Python
- Serving large ML models independently and asynchronously, using a message queue and KV storage for communication with other services. [EXPERIMENT] Updated Jul 20, 2021 - Python
- Resources for serving models in production. Updated Sep 25, 2019 - Python
- Applied machine learning projects. Updated Feb 3, 2020 - Jupyter Notebook
- A heterogeneous-system ML pipeline scheduling framework with Triton Inference Server as the backend. Updated Jun 25, 2023 - Python
- Example solution to the MLOps Case Study covering both online and batch processing. Updated Jul 2, 2024 - Pkl
- A curated list of awesome open source and commercial platforms for serving models in production 🚀 Updated Apr 20, 2022
- 🌐 Language identification for Scandinavian languages. Updated Jan 5, 2021 - Python
- A large ML project with infrastructure (MLflow, MinIO, Grafana), backend (FastAPI, CatBoost), and frontend (React, MapLibre). Updated Jun 26, 2024 - Python
- Miscellaneous code and writings for MLOps. Updated Jul 3, 2024 - Jupyter Notebook
- Integrating Aporia ML model monitoring into a Bodywork serving pipeline. Updated Jun 20, 2022 - Jupyter Notebook