diff --git a/docs/.vitepress/config.mts b/docs/.vitepress/config.mts index 9fde09f2..286dce8f 100644 --- a/docs/.vitepress/config.mts +++ b/docs/.vitepress/config.mts @@ -84,6 +84,7 @@ export default defineConfig({ { text: 'Overview', link: '/docs/overview' }, { text: 'Get Started', link: '/docs/get-started' }, { text: 'Next Steps', link: '/docs/next-steps' }, + { text: 'Deploy ModelKits', link: '/docs/deploy' }, { text: 'Kit Dev', link: '/docs/dev-mode' }, { text: 'Why KitOps?', link: '/docs/why-kitops' }, { text: 'How it is Used', link: '/docs/use-cases' }, diff --git a/docs/src/docs/deploy.md b/docs/src/docs/deploy.md new file mode 100644 index 00000000..275f09b7 --- /dev/null +++ b/docs/src/docs/deploy.md @@ -0,0 +1,112 @@ +# Deploying ModelKits + +This page outlines how to use the `init` or Kit CLI containers to deploy a ModelKit-packaged AI/ML project to Kubernetes or any other container runtime. The KitOps repo provides pre-built containers that support both semi-turnkey deployments and more DIY options. + +## Pre-built Containers + +There are currently two pre-built containers: + +1. Init container: https://github.com/jozu-ai/kitops/blob/main/build/dockerfiles/init/README.md +1. Kit CLI container: https://github.com/jozu-ai/kitops/blob/main/build/dockerfiles/README.md + +## Init Container + +The init container unpacks the model reference from a ModelKit to a specific path and then exits. This makes it useful as a Kubernetes `init` container. It can also automatically verify ModelKit signatures from key-based or keyless signers. + +The container is configurable via environment variables: + +- `$MODELKIT_REF`: The ModelKit to pull (required). +- `$UNPACK_PATH`: Where to unpack the ModelKit (normally you’d want a `volumeMount` here). Defaults to `/home/user/modelkit` if not set. +- `$UNPACK_FILTER`: Optional filter to limit what is unpacked (e.g., just the model, or model + code). 
The filter format is the same as the [`--filter` command line argument](./cli/cli-reference.md) for the Kit CLI. +- `$COSIGN_KEY`: Path to the key that should be used for verification, mounted inside the init container (e.g., from a Kubernetes secret). +- `$COSIGN_CERT_IDENTITY`: Signing identity for keyless signing. +- `$COSIGN_CERT_OIDC_ISSUER`: OIDC issuer for the keyless signer identity. + +### Example Kubernetes YAML + +```yaml + apiVersion: v1 + kind: Pod + metadata: + name: my-modelkit-pod + spec: + containers: + - name: model-server + image: "" # Some container that expects your modelkit + # Share a volumeMount between the init container and this one + volumeMounts: + - name: modelkit-storage + mountPath: /my-modelkit + + # Run the init container to unpack the ModelKit into the volume mount and make + # it available to the main container + initContainers: + - name: kitops-init + image: ghcr.io/jozu-ai/kitops-init-container:latest + env: + - name: MODELKIT_REF + value: "ghcr.io/jozu-ai/my-modelkit:latest" + - name: UNPACK_PATH + value: /tmp/my-modelkit + volumeMounts: + - name: modelkit-storage + mountPath: /tmp/my-modelkit + + # Define a volume to store the ModelKit + volumes: + - name: modelkit-storage + emptyDir: {} +``` + +## Using the Kit CLI Container + +The containerized Kit CLI lets you run any Kit CLI command, so you can tailor exactly how a ModelKit is pulled and run. This gives you flexibility, but more manual work (the world is your oyster, but it may be hard to shuck). + +This container runs `kit` as its entrypoint, accepting Kit CLI arguments. So you could run the container instead of downloading and installing the Kit CLI, although you’ll need to mount a Docker volume. 
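If you go this route, a minimal sketch of mounting a volume might look like the following. Note the in-container storage path `/home/user/.kitops` is an assumption, not something this page confirms; check the Kit CLI container README for the actual location Kit uses for its local storage:

```shell
# Sketch: run the containerized Kit CLI with a host directory mounted over
# Kit's local storage so ModelKits pulled in one run are available in later runs.
# NOTE: /home/user/.kitops is an assumed path; verify it in the container README.
mkdir -p "$(pwd)/kitops-storage"
docker run --rm \
  -v "$(pwd)/kitops-storage:/home/user/.kitops" \
  ghcr.io/jozu-ai/kit:latest \
  pull jozu.ml/jozu/llama3-8b:8B-instruct-q5_0
```

Without a mount like this, anything the CLI pulls is discarded when the container exits.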
+ +Docker run example: + +`docker run ghcr.io/jozu-ai/kit:latest pull jozu.ml/jozu/llama3-8b:8B-instruct-q5_0` + +Kubernetes example: + +```yaml + apiVersion: v1 + kind: Pod + metadata: + name: my-modelkit-pod + spec: + containers: + - name: me-using-kit + image: ghcr.io/jozu-ai/kit:latest + args: # You can put whatever you want; args is an array + - pull + - jozu.ml/jozu/llama3-8b:8B-instruct-q5_0 +``` + +## Creating a Custom Container + +Going a step further, you can use the Kit CLI container to create your own bespoke ModelKit containers. + +Example `Dockerfile` for a custom container with a ModelKit built into it: + +```dockerfile + # Staged build to grab the ModelKit so we can use it later + FROM ghcr.io/jozu-ai/kit:latest AS modelkit-download + + # Download your ModelKit into the container (replace the reference with your own) + RUN kit unpack jozu.ml/jozu/llama3-8b:8B-instruct-q5_0 -d /tmp/my-modelkit + + # Actual build stage; this just uses Alpine but you would build whatever + # container you need here + FROM alpine:latest + + # Normal container build + setup -- depends on your use case + # ... + # ... + + # Copy the downloaded ModelKit into this container + COPY --from=modelkit-download /tmp/my-modelkit /home/user/modelkit-data +``` + +**Questions or suggestions?** Drop an [issue in our GitHub repository](https://github.com/jozu-ai/kitops/issues) or join [our Discord server](https://discord.gg/Tapeh8agYy) to get support or share your feedback. \ No newline at end of file diff --git a/docs/src/docs/dev-mode.md b/docs/src/docs/dev-mode.md index d3db82c2..7b96ad0f 100644 --- a/docs/src/docs/dev-mode.md +++ b/docs/src/docs/dev-mode.md @@ -39,3 +39,5 @@ When you're done don't forget to stop the Kit dev server: ```sh kit dev stop ``` + +**Questions or suggestions?** Drop an [issue in our GitHub repository](https://github.com/jozu-ai/kitops/issues) or join [our Discord server](https://discord.gg/Tapeh8agYy) to get support or share your feedback. 
\ No newline at end of file diff --git a/docs/src/docs/get-started.md b/docs/src/docs/get-started.md index bc0ba4ad..6667de13 100644 --- a/docs/src/docs/get-started.md +++ b/docs/src/docs/get-started.md @@ -8,6 +8,7 @@ In this guide, we'll use ModelKits and the kit CLI to easily: * Package up a model, notebook, and datasets into a single ModelKit you can share through your existing tools * Push that versioned ModelKit package to a registry * Grab only the assets you need from the ModelKit for testing, integration, local running, or deployment +* Package the ModelKit as a container or Kubernetes deployment ## Before we start... @@ -121,6 +122,10 @@ kit push jozu.ml/brad/quick-start:latest Note that some registries, like Jozu Hub, don't automatically create a repository. If you receive an error from your `push` command, make sure you have created the repository in your target registry and that you have push rights to the repository. +### ModelKit to Container or Kubernetes + +You can build a container or Kubernetes deployment that pulls artifacts directly from the ModelKit. This makes automating container creation and Kubernetes deployment simple. Read more in our [deployment documentation](./deploy.md). + ### Congratulations You've learned how to unpack a ModelKit, pack one up, and push it. Anyone with access to your remote repository can now pull your new ModelKit and start playing with your model using the `kit pull` or `kit unpack` commands. @@ -134,4 +139,4 @@ If you'd like to learn more about using Kit, try our [Next Steps with Kit](./nex Or, if you want to run an LLM-based ModelKit locally try our [dev mode](./dev-mode.md) -Thanks for taking some time to play with Kit. We'd love to hear what you think. Feel free to drop us an [issue in our GitHub repository](https://github.com/jozu-ai/kitops/issues) or join [our Discord server](https://discord.gg/3eDb4yAN). +Thanks for taking some time to play with Kit. We'd love to hear what you think. 
Feel free to drop us an [issue in our GitHub repository](https://github.com/jozu-ai/kitops/issues) or join [our Discord server](https://discord.gg/Tapeh8agYy). diff --git a/docs/src/docs/kitfile/kf-overview.md b/docs/src/docs/kitfile/kf-overview.md index 179bc1ce..79bbe8e9 100644 --- a/docs/src/docs/kitfile/kf-overview.md +++ b/docs/src/docs/kitfile/kf-overview.md @@ -88,4 +88,6 @@ datasets: - description: validation data (tabular) name: validation data path: ./data/test.csv -``` \ No newline at end of file +``` + +**Questions or suggestions?** Drop an [issue in our GitHub repository](https://github.com/jozu-ai/kitops/issues) or join [our Discord server](https://discord.gg/Tapeh8agYy) to get support or share your feedback. \ No newline at end of file diff --git a/docs/src/docs/modelkit/intro.md b/docs/src/docs/modelkit/intro.md index 8a50864a..f8e05c84 100644 --- a/docs/src/docs/modelkit/intro.md +++ b/docs/src/docs/modelkit/intro.md @@ -21,3 +21,5 @@ ModelKit revolutionizes the way AI/ML artifacts are shared and managed throughou **Optimized for AI/ML Workflows:** ModelKits are tailor-made for AI/ML projects, addressing specific needs such as versioning and environment configuration. ModelKit is not just a packaging format; it's a building block for innovation, simplifying the complexities of AI/ML development and deployment. By adopting ModelKit, teams can focus more on creating value and less on managing the intricacies of artifact storage and sharing. + +**Questions or suggestions?** Drop an [issue in our GitHub repository](https://github.com/jozu-ai/kitops/issues) or join [our Discord server](https://discord.gg/Tapeh8agYy) to get support or share your feedback. 
\ No newline at end of file diff --git a/docs/src/docs/overview.md b/docs/src/docs/overview.md index 170cf100..79ca9310 100644 --- a/docs/src/docs/overview.md +++ b/docs/src/docs/overview.md @@ -6,7 +6,7 @@ KitOps is an innovative open-source project designed to enhance collaboration am ### 🎁 ModelKit -At the heart of KitOps is the ModelKit, an OCI-compliant packaging format that enables the seamless sharing of all necessary artifacts involved in the AI/ML model lifecycle. This includes datasets, code, configurations, and the models themselves. By standardizing the way these components are packaged, ModelKit facilitates a more streamlined and collaborative development process that is compatible with nearly any tool. +At the heart of KitOps is the ModelKit, an OCI-compliant packaging format that enables the seamless sharing of all necessary artifacts involved in the AI/ML model lifecycle. This includes datasets, code, configurations, and the models themselves. By standardizing the way these components are packaged, ModelKit facilitates a more streamlined and collaborative development process that is compatible with nearly any tool. You can even [deploy ModelKits to containers or Kubernetes](./deploy.md). ### 📄 Kitfile @@ -37,15 +37,19 @@ KitOps enables you to innovate in AI/ML without the usual infrastructure distrac KitOps is not just another tool; it's a comprehensive CLI and packaging system specifically designed for the AI/ML workflow. It acknowledges the nuanced needs of AI/ML projects, such as: ### 📊 Management of Unstructured Datasets + AI/ML projects often deal with large, unstructured datasets, such as images, videos, and audio files. KitOps simplifies the versioning and sharing of these datasets, making them as manageable as traditional code. ### 🤝 Synchronized Data and Code Versioning + One of the core strengths of KitOps is its ability to keep data and code versions in sync. 
This crucial feature solves the reproducibility issues that frequently arise in AI/ML development, ensuring consistency and reliability across project stages. ### 🚀 Deployment Ready -Designed with a focus on deployment, ModelKits package assets in standard formats so they're compatible with nearly any tool - helping you get your model to production faster and more efficiently. + +Designed with a focus on deployment, ModelKits package assets in standard formats so you can deploy them [as containers or to Kubernetes](./deploy.md). They're also [compatible with nearly any tool](./modelkit/compatibility.md) - helping you get your model to production faster and more efficiently. ### 🏭 Standards-Based Approach + KitOps champions openness and interoperability through its core components, ensuring seamless integration into your existing workflows: ModelKits are designed as OCI (Open Container Initiative) artifacts, making them fully compatible with the Docker image registries and other OCI-compliant storage solutions you already use. This compatibility allows for an easy and familiar integration process. By adhering to widely accepted standards, KitOps ensures you're not tied to a single vendor or platform. This flexibility gives you the freedom to choose the best tools and services for your needs without being restricted by proprietary formats.