
LocalAI - Your Personal AI on your Computer

LocalAI is an AI-powered chatbot that runs locally on your computer, providing a personalized AI experience without the need for internet connectivity. It utilizes a massive neural network with 60 billion parameters, making it one of the most powerful chatbots available. The public version of LocalAI currently utilizes a 13 billion parameter model. With its availability on Mac, Windows, Linux, and Android platforms, LocalAI is your personal AI assistant that can help you with a wide range of tasks.

Join the Discord server here.

How to Get LocalAI Running on Your Computer

Alpaca and Llama are AI models that can be installed on your computer for natural language processing. Follow the instructions below to get started.


System Requirements

  • Operating System: Linux, Mac, or Windows
  • Memory: minimum 4GB (for 7B model) up to 40.88GB (for 65B model)
  • Disk Space: minimum 4.21GB (for 7B model) up to 432.64GB (for 65B model)

Installation Instructions

  1. Install Node.js
  2. Install Llama and/or Alpaca Models
    • Alpaca Models: To install the 7B Alpaca model, run the following command:
      npx dalai alpaca install 7B
      
    • Llama Models: To install the Llama models, run the following command:
      npx dalai llama install [MODEL NAME]
      
      Replace [MODEL NAME] with the name of the model you want to install. The available options are 7B, 13B, 30B, and 65B.
  3. Download the build compatible with your device and run it

Troubleshooting

  • Node.js not installed: If Node.js is not installed, download and install the latest version of Node.js from https://nodejs.org/en/download/
  • Installation failed: If the installation fails, make sure you have the necessary dependencies installed. For Windows users, install Visual Studio and make sure to check the "Python development," "Node.js development," and "Desktop development with C++" options during installation. For Mac users, install the necessary dependencies using Homebrew.
  • PowerShell issue on Windows: Run all commands in cmd instead of PowerShell, as PowerShell may cause the script to fail silently.

Using LocalAI

To use LocalAI, follow these instructions:

  1. Install Python 3 on your computer.
  2. Clone the LocalAI repository from GitHub.
  3. Install the required Python module by running the following command in the terminal: pip install termcolor (the json module is part of Python's standard library and does not need to be installed separately).
  4. Set up the JSON configuration file (config.json) with your preferred settings; an example configuration is shown in the section below.
  5. Run the LocalAI script to start LocalAI.
  6. Type in your question or request and LocalAI will do its best to help you.
  7. To end the conversation, type exit.
  8. To change your display name, type /change username <new_name>. (A minimal sketch of how these commands could be handled follows this list.)
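
The chat loop itself is not shown in this README, so the following is a minimal sketch only, based on the steps above. The generate() helper is a hypothetical stand-in for whatever call LocalAI actually makes into the model; the config.json fields, the exit command, and /change username are taken from this document.

# Hypothetical sketch of the LocalAI chat loop, not the actual implementation.
import json

from termcolor import colored  # installed earlier with: pip install termcolor


def generate(prompt, config):
    # Stand-in for the real model call; just echoes the prompt for illustration.
    return f"(top_k={config['top_k']}, temp={config['temp']}) {prompt}"


def main():
    # config.json is assumed to sit next to the script (see the example below).
    with open("config.json", encoding="utf-8") as f:
        config = json.load(f)

    username = "You"
    while True:
        user_input = input(colored(username + "> ", "green")).strip()
        if user_input == "exit":  # step 7: end the conversation
            break
        if user_input.startswith("/change username "):  # step 8: change display name
            username = user_input.split(maxsplit=2)[2]
            continue
        print(colored("LocalAI> ", "cyan") + generate(user_input, config))


if __name__ == "__main__":
    main()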

To configure LocalAI, you can modify the config.json file. Here's an example configuration file:

{
    "seed": -1,
    "threads": 4,
    "n_predict": 1,
    "model_path": "Your model location",
    "top_k": 40,
    "top_p": 0.9,
    "temp": 0.1,
    "repeat_last_n": 64,
    "repeat_penalty": 1.3
}

Feel free to adjust the values to your liking. Note that the model_path field should contain the path to the downloaded 13 billion parameter model.
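
As an illustration only, here is how a script might load this config.json and sanity-check it before starting. The field names match the example above; the checks themselves are assumptions and not part of LocalAI.

# Illustrative only: load the example config.json and sanity-check it.
import json
import os
import sys

with open("config.json", encoding="utf-8") as f:
    config = json.load(f)

# model_path must point at the model weights downloaded earlier.
if not os.path.exists(config["model_path"]):
    sys.exit("model_path does not exist: " + config["model_path"])

# top_p is a probability; temp controls sampling randomness (assumed sane ranges).
if not 0.0 < config["top_p"] <= 1.0:
    sys.exit("top_p must be in (0, 1]")
if config["temp"] <= 0.0:
    sys.exit("temp must be positive")

print(f"Loaded config: {config['threads']} threads, seed {config['seed']}")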

For more information and updates, follow LocalAI on GitHub. You can also join the Discord server here.

Contribution

Contributions are always welcome! If you'd like to contribute to LocalAI, feel free to fork the repository and submit a pull request. Please ensure that your code follows the project's coding standards and passes all tests before submitting a pull request.

License

LocalAI is licensed under the MIT License. See the LICENSE file for more information.
