[Join Discord Server] [Try it]
Note
If you don't know what Project AIRI is, here's the introduction:
Project AIRI aims to build the ultimate virtual AI that lives within cyberspace; we are currently targeting a re-creation of Neuro-sama.
Funding? Startup?
Currently, we have several partnership projects with other companies to build similar products, but I (@nekomeowww) primarily self-sponsor the most active contributors of both @proj-airi and @moeru-ai to support their work. I'd love the opportunity to work on this project long term without worrying about funding. If you are indeed interested in this project, please contact me.
Who are we?
We are a group of currently non-funded, talented people: computer scientists, experts in multi-modal fields, designers, product managers, and popular open source contributors who love the goal we are heading toward.
We wish a better virtual humanoid could live with us, so here we are, aiming to build a character interaction system like the one that powers the famous AI VTuber Neuro-sama.
github.com/moeru-ai/airi: our longest adventure toward AGI in the virtual world.
- `unspeech`: Universal endpoint proxy server for `/audio/transcriptions` and `/audio/speech`, like LiteLLM but for any ASR and TTS
- `hfup`: Tools to help with deploying and bundling to HuggingFace Spaces
- `@proj-airi/drizzle-duckdb-wasm`: Drizzle ORM driver for DuckDB WASM
- `@proj-airi/duckdb-wasm`: Easy-to-use wrapper for `@duckdb/duckdb-wasm`
- `@proj-airi/lobe-icons`: Iconify JSON bundle for amazing AI & LLM icons from lobe-icons, supports Tailwind and UnoCSS
- Airi Factorio: Allow Airi to play Factorio
- Factorio RCON API: RESTful API wrapper for the Factorio headless server console
- `autorio`: Factorio automation library
- `tstl-plugin-reload-factorio-mod`: Reload the Factorio mod during development
- 🥺 SAD: Documentation and notes for self-hosting and running LLMs in the browser
- `@velin-dev/ml`: Use Vue SFC and Markdown to write easy-to-manage stateful prompts for LLMs
- `demodel`: Easily boost the speed of pulling your models and datasets from various inference runtimes
- `inventory`: Centralized model catalog and default provider configuration backend service
- MCP Launcher: Easy-to-use MCP builder & launcher for all possible MCP servers, just like Ollama for models!
Our best partner is @moeru-ai, where many side projects were born.