🤖 Milo

Your friendly workspace buddy for Miles on Slack. Milo uses local AI models through Ollama to provide helpful responses while keeping your data private.

Features

  • 💬 Chat with Milo by mentioning @milo
  • 🔄 Switch between different AI models on the fly
  • 🚀 Fast responses using local models
  • 🔒 Privacy-focused: all processing happens locally

Commands

Just mention @milo followed by your command (a sketch of how these mentions might be parsed follows the list):

  • @milo help - Show available commands
  • @milo list models - Show available AI models
  • @milo use model <name> - Switch to a different model
  • @milo reset model - Reset to the default model (llama3.2:1b)
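
Internally, these commands can be handled with a simple dispatch on the mention text. The sketch below is a minimal, hypothetical example; the type and function names (Command, parseMention) are illustrative assumptions, not taken from Milo's source.

    // Hypothetical sketch of parsing @milo mention text into commands.
    type Command =
      | { kind: "help" }
      | { kind: "listModels" }
      | { kind: "useModel"; model: string }
      | { kind: "resetModel" }
      | { kind: "chat"; prompt: string };

    function parseMention(text: string): Command {
      // Slack includes the bot mention ("<@U…>") at the start of the event text; strip it.
      const body = text.replace(/<@[^>]+>\s*/, "").trim();

      if (body === "help") return { kind: "help" };
      if (body === "list models") return { kind: "listModels" };
      if (body === "reset model") return { kind: "resetModel" };
      if (body.startsWith("use model ")) {
        return { kind: "useModel", model: body.slice("use model ".length).trim() };
      }
      // Anything else is treated as a prompt for the current model.
      return { kind: "chat", prompt: body };
    }

    // Example: parseMention("<@U123ABC> use model mistral")
    // -> { kind: "useModel", model: "mistral" }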

Setup

  1. Install dependencies:

    npm install
  2. Install Ollama:

    # macOS
    brew install ollama
  3. Pull the default model:

    ollama pull llama3.2:1b
  4. Create a .env file (its values are used as sketched after these steps):

    SLACK_BOT_TOKEN=xoxb-your-bot-token
    SLACK_SIGNING_SECRET=your-signing-secret
    SLACK_APP_TOKEN=xapp-your-app-token
  5. Start Ollama:

    ollama serve
  6. Start Milo:

    npm start
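
For reference, steps 4–6 come together roughly like this. This is a minimal sketch assuming Milo uses Slack's Socket Mode (the xapp- app token suggests it) and the Bolt for JavaScript API; it is not a copy of Milo's actual code.

    import { App } from "@slack/bolt";

    // Assumed setup: Socket Mode with the tokens from the .env file above.
    const app = new App({
      token: process.env.SLACK_BOT_TOKEN,
      signingSecret: process.env.SLACK_SIGNING_SECRET,
      appToken: process.env.SLACK_APP_TOKEN,
      socketMode: true,
    });

    app.event("app_mention", async ({ event, say }) => {
      // In the real bot this is where command parsing and the Ollama call would happen.
      await say(`You said: ${event.text}`);
    });

    (async () => {
      await app.start();
      console.log("⚡ Milo is running");
    })();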

Available Models

Milo uses Ollama models. Install additional models with:

ollama pull <model-name>

Some recommended models:

  • llama3.2:1b (default)
  • codellama
  • mistral
  • neural-chat
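
Under the hood, Milo can reach the local Ollama server over its HTTP API (by default http://localhost:11434). The sketch below uses Ollama's standard /api/tags and /api/generate endpoints and Node 18+'s built-in fetch; the helper names and the OLLAMA_URL variable are assumptions for illustration, not Milo's actual code.

    const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434";

    // List locally installed models (what "list models" could report).
    async function listModels(): Promise<string[]> {
      const res = await fetch(`${OLLAMA_URL}/api/tags`);
      const data = (await res.json()) as { models: { name: string }[] };
      return data.models.map((m) => m.name);
    }

    // Ask a model for a single, non-streamed completion.
    async function generate(model: string, prompt: string): Promise<string> {
      const res = await fetch(`${OLLAMA_URL}/api/generate`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model, prompt, stream: false }),
      });
      const data = (await res.json()) as { response: string };
      return data.response;
    }

    // Example: console.log(await generate("llama3.2:1b", "Say hi to the team"));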

Development

Built with:

  • TypeScript
  • Slack Bolt Framework
  • Ollama API

Privacy & Security

Milo processes all queries locally using Ollama, ensuring your conversations stay private and secure.

Contributing

Issues and pull requests are welcome! Feel free to contribute to make Milo even better.
