CrowdLlama is a distributed system that builds on the open-source Ollama project to run large language model (LLM) inference across multiple nodes over a peer-to-peer (P2P) network, enabling collaborative inference workloads.
CrowdLlama enables distributed AI computing: users share their own computational resources and, in return, gain access to the combined capabilities of the network. The system uses a Distributed Hash Table (DHT) for peer discovery and coordination.
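To illustrate the core idea behind DHT coordination, here is a minimal, self-contained sketch of consistent-hashing lookup: keys and peer IDs are hashed onto the same ring, and a key is owned by the first peer at or after its position. This is a conceptual toy, not CrowdLlama's actual implementation (which builds on a real P2P DHT); all names here are illustrative.

```go
package main

import (
	"crypto/sha256"
	"encoding/binary"
	"fmt"
	"sort"
)

// idFor hashes a string (a peer ID or a content key) to a 64-bit ring position.
func idFor(s string) uint64 {
	h := sha256.Sum256([]byte(s))
	return binary.BigEndian.Uint64(h[:8])
}

// lookup returns the peer whose ring position is the first at or after the
// key's position, wrapping around the ring -- the basic routing rule of a DHT.
func lookup(key string, peers []string) string {
	target := idFor(key)
	sorted := append([]string(nil), peers...)
	sort.Slice(sorted, func(i, j int) bool { return idFor(sorted[i]) < idFor(sorted[j]) })
	for _, p := range sorted {
		if idFor(p) >= target {
			return p
		}
	}
	return sorted[0] // wrapped past the largest position
}

func main() {
	peers := []string{"peer-a", "peer-b", "peer-c", "peer-d"}
	key := "worker-metadata/llava"
	fmt.Printf("key %q maps to %s\n", key, lookup(key, peers))
}
```

Because every node hashes keys the same way, any participant can independently compute which peer is responsible for a piece of metadata, with no central registry.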
The core distributed system implementation featuring:
- DHT-based peer discovery for network coordination
- Worker nodes that advertise GPU capabilities and supported models
- Simple metadata protocol for querying worker information
- Consumer components (planned) for distributed task execution
Built with Go, this repository contains the foundational P2P networking infrastructure that powers the entire CrowdLlama ecosystem.
Terraform-based automation for deploying CrowdLlama DHT servers on Linode cloud infrastructure:
- Automated deployment with Docker and Docker Compose
- Auto-updating containers via Watchtower integration
- Production-ready configuration with firewall rules and systemd services
- Multi-region support across Linode's global infrastructure
- GitHub Actions integration for CI/CD pipeline
This repository enables scalable, reliable deployment of CrowdLlama infrastructure components.
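The Docker Compose plus Watchtower pattern described above might look roughly like this. This is a hypothetical fragment: the `crowdllama/dht` image name and listen port are assumptions, not taken from the actual infra repository; only the Watchtower image and its Docker socket mount reflect Watchtower's standard setup.

```yaml
services:
  dht:
    image: crowdllama/dht:latest   # assumed image name
    restart: unless-stopped
    ports:
      - "4001:4001"                # assumed P2P listen port

  # Watchtower polls the registry and restarts containers when a new
  # image is published, giving the auto-update behavior described above.
  watchtower:
    image: containrrr/watchtower
    restart: unless-stopped
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```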
Cross-platform desktop application built with Electron and React, providing:
- User-friendly interface for sharing computational resources
- Chat interface for interacting with AI models
- Network status monitoring with real-time peer count
- Model selection for contributing different AI models (llava, mistral, codellama, phi, gemma)
- Modern UI with Tailwind CSS and responsive design
The desktop app makes it easy for users to participate in the CrowdLlama network and access distributed AI capabilities.
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Desktop App │ │ Worker Node │ │ DHT Server │
│ (Electron) │◄──►│ (Go/Ollama) │◄──►│ (Go) │
│ │ │ │ │ │
│ • Chat UI │ │ • GPU Resources │ │ • Peer Discovery│
│ • Network Status│ │ • Model Serving │ │ • Coordination │
│ • Resource Share│ │ • P2P Network │ │ • Metadata │
└─────────────────┘ └─────────────────┘ └─────────────────┘
- Deploy Infrastructure: Use the infra repository to set up DHT servers
- Run Workers: Build and run worker nodes using the crowdllama repository
- Use Desktop App: Download and run the desktop application
We welcome contributions! Each repository has its own contribution guidelines:
- crowdllama - Core distributed system
- infra - Infrastructure automation
- desktop - Desktop application
All CrowdLlama projects are licensed under the MIT License. See individual repositories for license details.
- Main Repository: crowdllama/crowdllama
- Infrastructure: crowdllama/infra
- Desktop App: crowdllama/desktop
- Ollama Project: ollama/ollama
CrowdLlama - Distributed AI Computing for Everyone