
A browser extension that brings local AI to your device


axonzeta/aibrow


AiBrow

Run local AI in your browser with the unified AiBrow JavaScript API. The API makes the best use of your device's hardware to run local AI in the browser as performantly as possible. It's based on the Chrome built-in AI APIs, but adds support for features such as custom/HuggingFace models, grammar schemas, JSON output, LoRA adapters, embeddings, and a fallback to a self-hosted or public server for lower-powered devices.

| Engine | Targets | Performance |
| --- | --- | --- |
| Chrome AI | Chrome Desktop | ⭐️⭐️ |
| llama.cpp | Desktop browsers | ⭐️⭐️⭐️ |
| WebGPU | Desktop & Android browsers | ⭐️ |
| Self-hosted Server | Any browser | ⭐️⭐️⭐️ |
| Public Server | Any browser | ⭐️⭐️ |

When using AiBrow through the llama.cpp extension or WebGPU, it runs small language models completely on your machine, making it fast and keeping your data private. You can complete sentences, improve your writing, rephrase text, or sum up complex information using the latest small LLMs such as Llama 3.2, Phi 3.5, Gemma 2 and Qwen 2.5.

The AiBrow API follows the current proposal for the browser Prompt API in development in Google Chrome.

Quick Start

Install the dependencies:

npm install @aibrow/web

You can use the `languageModel` API to have a conversation with the AI, using whichever backend you choose:

import AI from '@aibrow/web'

// WebGPU
const webGpu = await AI.web.languageModel.create()
console.log(await webGpu.prompt('Write a short poem about the weather'))

// Llama.cpp
const ext = await AI.aibrow.languageModel.create()
console.log(await ext.prompt('Write a short poem about the weather'))

// Chrome AI
const browser = await AI.browser.languageModel.create()
console.log(await browser.prompt('Write a short poem about the weather'))

// Server
const server = await AI.server.languageModel.create()
console.log(await server.prompt('Write a short poem about the weather'))
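Responses can also be streamed chunk-by-chunk via `promptStreaming`, which is used later in this README and returns an async iterable. `streamPrompt` below is a hypothetical convenience wrapper, not part of the AiBrow API; it assumes each chunk is an incremental piece of text and works with a session created by any of the backends above.

```javascript
// Collect a streamed response into a single string.
// Assumes chunks are incremental deltas (as in the streaming
// example further down in this README).
async function streamPrompt (session, text) {
  let out = ''
  const stream = await session.promptStreaming(text)
  for await (const chunk of stream) {
    out += chunk
  }
  return out
}

// Usage (with any backend):
// const session = await AI.web.languageModel.create()
// console.log(await streamPrompt(session, 'Write a short poem about the weather'))
```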

AiBrow extension using llama.cpp natively

Get AiBrow for Chrome Get AiBrow for Firefox

Step by step install

Using the AiBrow extension gives the best on-device performance with the broadest feature set. It's a browser extension that leverages the powerful llama.cpp library and gives great performance on all kinds of desktop computers, using either the GPU or the CPU. Downloaded models are stored in a common repository, meaning each model only needs to be downloaded once. You can use models provided by AiBrow, or any GGUF model hosted on HuggingFace.

Getting started
import AI from '@aibrow/web'

const { ready, extension, helper } = await AI.aibrow.capabilities()
if (ready) {
  const session = await AI.aibrow.languageModel.create()
  console.log(await session.prompt('Write a short poem about the weather'))
} else {
  // Here are some tips to help users install the AiBrow extension & helper https://docs.aibrow.ai/guides/helping-users-install-aibrow
  console.log(`Extension is not fully installed. Extension=${extension}. Helper=${helper}`)
}
Additional info: when the extension is installed, it's directly usable on any page via `window.aibrow`. If installed on a browser other than Google Chrome, it also automatically polyfills `window.ai`.
Typescript types

Types for `window.aibrow` can be added to your project by installing the `@aibrow/dom-types` package (`npm install --save-dev @aibrow/dom-types`). Then, to expose them, place the following either in your `global.d.ts` or the entry point of your code:

import type AI from "@aibrow/dom-types"

declare global {
  interface Window {
    readonly aibrow: typeof AI;
  }
}
Other extensions

Other extensions can make use of the AiBrow extension via the extension library: `npm install @aibrow/extension`

import aibrow from '@aibrow/extension'

const { helper, extension } = await aibrow.capabilities()
if (extension) {
  if (helper) {
    const session = await aibrow.languageModel.create()
    const stream = await session.promptStreaming('Write a poem about AI in the browser')
    for await (const chunk of stream) {
      console.log(chunk)
    }
  } else {
    console.log('Aibrow helper not installed')
  }
} else {
  console.log('Aibrow not installed')
}
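The `{ extension, helper }` branches above can be factored into a small pure helper; `describeCapabilities` is a hypothetical name used here for illustration, not part of the AiBrow API.

```javascript
// Turn the { extension, helper } capability flags into a
// human-readable status, mirroring the branches in the snippet above.
function describeCapabilities ({ extension, helper }) {
  if (!extension) return 'Aibrow not installed'
  if (!helper) return 'Aibrow helper not installed'
  return 'ready'
}
```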

AiBrow on WebGPU

WebGPU provides a good middle-ground for performance and feature set, but it comes with some memory usage restrictions and performance overheads. If you only need to use small models or want to provide a fallback for when the extension isn't installed this can provide a great solution. Under the hood it uses transformers.js from HuggingFace. Models are downloaded through an AiBrow frame which means models only need to be downloaded once. You can use models provided by AiBrow, or any ONNX model hosted on HuggingFace.

Getting started
import AI from '@aibrow/web'

const session = await AI.web.languageModel.create()
console.log(await session.prompt('Write a short poem about the weather'))

Chrome built-in AI

The Chrome built-in AI is a great option for simple tasks such as summarization, writing, etc. It has a smaller feature set compared to the AiBrow extension and WebGPU, but reasonable on-device performance.

Getting started
import AI from '@aibrow/web'

if (AI.browser) {
  const session = await AI.browser.languageModel.create()
  console.log(await session.prompt('Write a short poem about the weather'))
} else {
  console.log(`Your browser doesn't support the built-in window.ai`)
}
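Putting the four backends together, a small fallback helper can pick the most capable one available. This is a sketch rather than part of the AiBrow API: the ordering is a design choice, and the helper takes the imported `AI` namespace as an argument so it can be reused or mocked.

```javascript
// Pick the first usable backend, in rough order of capability:
// extension (llama.cpp) → Chrome built-in AI → WebGPU → server.
async function pickBackend (AI) {
  const { ready } = await AI.aibrow.capabilities()
  if (ready) return AI.aibrow
  if (AI.browser) return AI.browser
  if (AI.web) return AI.web
  return AI.server
}

// Usage:
// import AI from '@aibrow/web'
// const backend = await pickBackend(AI)
// const session = await backend.languageModel.create()
```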

How can I build AiBrow myself?

  1. Run `npm run watch` or `npm start` to build the extension
  2. Run `out/native/boot.cjs --install` to install the native manifest hooks
  3. Install the extension from `out/extension` into the browser
  4. Logs from the native binary are available in `out/native/aibrow.log`

Examples

  1. AiBrow Playground
  2. AiBrow demo extension