
Appilot

Appilot ['æpaɪlət] stands for application-pilot. It is an experimental project that helps you operate applications using GPT-like LLMs.

Features

  • Application management: deploy, upgrade, rollback, etc.
  • Environment management: clone, view topology, etc.
  • Diagnosis: view logs, find flaws, and provide fixes.
  • Safeguard: any action involving state changes requires human approval.
  • Hybrid infrastructure: works on Kubernetes, VMs, cloud, and on-prem.
  • Multi-language support: choose the natural language you're comfortable with.
  • Pluggable backends: supports multiple backends, including Walrus and Kubernetes, and is extensible.

Demo

Chat to deploy llama-2 on AWS:

(Demo video: appilot-llama2.mov)

Other use cases:

Quickstart

Prerequisites:

  • Get an OpenAI API key with access to the gpt-4 model.
  • Install python3 and make.
  • Install kubectl and helm.
  • Have a running Kubernetes cluster.
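
To confirm the prerequisites are in place, you can run a quick check like the following (a sketch; exact versions and output will differ on your machine):

python3 --version
make --version
kubectl version --client
helm version
kubectl get nodes    # verifies the cluster is reachable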
  1. Clone the repository:
git clone https://github.com/seal-io/appilot && cd appilot
  2. Run the following command to get the envfile:
cp .env.example .env
  3. Edit the .env file and fill in OPENAI_API_KEY (see the example after these steps).

  4. Run the following command to install. It will create a venv and install the required dependencies:

make install
  5. Run the following command to get started:
make run
  6. Ask Appilot to deploy an application, e.g.:
> Deploy a jupyterhub.
...
> Get url of the jupyterhub.
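
For reference, a minimal .env for the default Kubernetes backend could look like the lines below (the key is a placeholder; TOOLKITS defaults to kubernetes, so setting it explicitly is optional):

OPENAI_API_KEY=sk-your-openai-key
TOOLKITS=kubernetes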

Usage

Configuration

Appilot is configurable via environment variables or the envfile:

| Parameter | Description | Default |
| --- | --- | --- |
| OPENAI_API_KEY | OpenAI API key; access to the gpt-4 model is required. | "" |
| OPENAI_API_BASE | Custom OpenAI API base. You can integrate other LLMs as long as they serve in the same API style. | "" |
| TOOLKITS | Toolkits to enable. Currently supports Kubernetes and Walrus. Case insensitive. | "kubernetes" |
| NATURAL_LANGUAGE | Natural language the AI uses to interact with you, e.g., Chinese, Japanese, etc. | "English" |
| SHOW_REASONING | Show AI reasoning steps. | True |
| VERBOSE | Output in verbose mode. | False |
| WALRUS_URL | URL of Walrus; valid when the Walrus toolkit is enabled. | "" |
| WALRUS_API_KEY | API key of Walrus; valid when the Walrus toolkit is enabled. | "" |
| WALRUS_SKIP_TLS_VERIFY | Skip TLS verification for the Walrus API. Use when testing with self-signed certificates. Valid when the Walrus toolkit is enabled. | True |
| WALRUS_DEFAULT_PROJECT | Project name for the default context; valid when the Walrus toolkit is enabled. | "" |
| WALRUS_DEFAULT_ENVIRONMENT | Environment name for the default context; valid when the Walrus toolkit is enabled. | "" |
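
For example, to supply a setting as an environment variable rather than through the envfile (a sketch, assuming you launch with make run as in the quickstart):

export SHOW_REASONING=False
export VERBOSE=True
make run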

Using Kubernetes Backend

Follow steps in quickstart to run with Kubernetes backend.

Using Walrus Backend

Prerequisites: Install Walrus.

Walrus serves as the application management engine. It provides features like hybrid infrastructure support, environment management, etc. To enable Walrus backend, edit the envfile:

  1. Set TOOLKITS=walrus
  2. Fill in OPENAI_API_KEY, WALRUS_URL and WALRUS_API_KEY (see the example below)
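
For reference, a Walrus-oriented envfile might look like the lines below (all values are placeholders; the Walrus URL and API key come from your own Walrus installation):

TOOLKITS=walrus
OPENAI_API_KEY=sk-your-openai-key
WALRUS_URL=https://your-walrus-address
WALRUS_API_KEY=your-walrus-api-key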

Then you can run Appilot to get started:

make run

Run with Docker

You can run Appilot in a Docker container when using the Walrus backend.

Prerequisites: Install Docker.

  1. Get an envfile by running the following command:
cp .env.example .env
  2. Configure the .env file:
  • Set TOOLKITS=walrus
  • Fill in OPENAI_API_KEY, WALRUS_URL and WALRUS_API_KEY
  3. Run the following command:
docker run -it --env-file .env sealio/appilot:main
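
If you prefer not to keep secrets in a file, the same variables can be passed directly with -e flags instead of --env-file (a sketch; values are placeholders):

docker run -it \
  -e TOOLKITS=walrus \
  -e OPENAI_API_KEY=sk-your-openai-key \
  -e WALRUS_URL=https://your-walrus-address \
  -e WALRUS_API_KEY=your-walrus-api-key \
  sealio/appilot:main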

Using LLM alternatives to GPT-4

You can use other LLMs as the reasoning engine of Appilot, as long as they serve inference APIs in an OpenAI-compatible way.

  1. Configure the .env file and set OPENAI_API_BASE=https://your-api-base (a minimal example follows).

  2. Run Appilot as normal.
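
For instance, if you host a model behind an OpenAI-compatible server on your own machine, the envfile might contain something like the lines below (the URL and key are placeholders; the exact base path depends on the server you use):

OPENAI_API_BASE=http://localhost:8000/v1
OPENAI_API_KEY=placeholder-key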

How it works

The following is the architecture diagram of Appilot:

(Architecture diagram: appilot-arch)

License

Copyright (c) 2023 Seal, Inc.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License in the LICENSE file.

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
