robotical/talk-with-marty

talk-with-marty-lib

A small TypeScript library that provides an abstraction layer for interacting with a virtual robot "Marty" using speech-to-text, text-to-speech, and large language model (LLM) providers. The codebase contains provider interfaces and light mock implementations suitable for local development and testing.

What this repo contains

  • index.ts — Entry point that wires components together.
  • marty-comm.ts — Communication helpers for interacting with Marty.
  • types.ts — Shared TypeScript types for LLMRequest, LLMResponse, and settings.
  • config/defaults.ts — Default configuration values.
  • interaction/ — Interaction control flow:
    • index.ts — High-level interaction orchestration.
    • interaction-controller.ts — Core logic for managing interaction state.
    • push-to-talk.ts — Push-to-talk implementation.
    • wake-words.ts — Wake-word detection helpers (interface / mock).
  • llm/ — LLM provider abstractions and implementations:
    • index.ts — Exports for LLM providers.
    • llm-provider.ts — LLMProvider interface.
    • mock-llm.ts — Local stub provider for development.
    • openai-llm-provider.ts — OpenAI Chat Completions integration.
    • swiss-ai-llm-provider.ts — Swiss-AI Apertus 70B instruct integration via Hugging Face Inference.
  • stt/ — Speech-to-text providers and mocks.
  • transcript/ — Transcript manager.
  • tts/ — Text-to-speech providers and mocks.
  • utils/event-emitter.ts — Small event emitter utility.
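To illustrate how the pieces above fit together, here is a minimal sketch of what the shared types and the LLMProvider interface might look like. The exact field names are assumptions for illustration; see types.ts and llm/llm-provider.ts for the actual definitions.

```typescript
// Hypothetical shapes, assumed from the file names above; the real
// definitions live in types.ts and llm/llm-provider.ts.
interface LLMRequest {
  prompt: string;
  history?: { role: "user" | "assistant"; content: string }[];
}

interface LLMResponse {
  text: string;
}

interface LLMProvider {
  generateResponse(request: LLMRequest): Promise<LLMResponse>;
}

// A stub provider in the spirit of mock-llm.ts: it echoes the prompt
// back instead of calling a network API.
class EchoLLMProvider implements LLMProvider {
  async generateResponse(request: LLMRequest): Promise<LLMResponse> {
    return { text: `You said: ${request.prompt}` };
  }
}
```

Keeping every provider behind one interface like this is what lets the mocks and the real OpenAI / Swiss-AI implementations be swapped without touching the interaction code.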

Status

This library provides interfaces and reference implementations intended for local testing and development. Real providers are included for OpenAI's Chat Completions API and the Swiss-AI Apertus 70B instruct model via Hugging Face Inference; both require valid API keys.

Getting Started

Prerequisites: Node.js and npm or yarn. This project is TypeScript-based.

  1. Install dependencies (this project includes package.json):

         npm install
         # or
         yarn install

     If you want to load API keys from a .env file during development, copy the example file:

         cp .env.example .env
         # then edit .env with your keys

  2. Build TypeScript (if configured):

         npx tsc

  3. Run the development example (watches with ts-node-dev):

         npm run dev

  4. Build the library for publishing or local consumption:

         npm run build

After npm run build, the compiled library is emitted to the dist/ folder; other projects can import it by referencing the package (after publishing) or by using a local file path.

Dotenv: example.ts will attempt to load a .env file using dotenv when one is present. dotenv is an optional dependency, so you can also export environment variables directly in your shell instead.
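The optional loading described above can be sketched as a guarded require; the try/catch is what keeps things working when dotenv is not installed (this is an illustrative pattern, not a copy of example.ts):

```typescript
// Attempt to load .env via dotenv, but tolerate its absence; variables
// exported in the shell still work either way.
let dotenvLoaded = false;
try {
  // require() is used so the import failure is catchable at runtime.
  require("dotenv").config();
  dotenvLoaded = true;
} catch {
  // dotenv not installed; fall back to the process environment as-is.
}

const apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) {
  console.warn("OPENAI_API_KEY is not set; real providers will fail.");
}
```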

Choosing an LLM Provider

Set the MARTY_LLM_PROVIDER environment variable to switch between bundled providers:

  • openai (default) — uses OPENAI_API_KEY or OPENAI_API_BASE if you need a custom endpoint.
  • swiss-ai — uses the Hugging Face Inference API for swiss-ai/Apertus-70B-Instruct-2509. Provide a token via SWISS_AI_API_KEY, HUGGINGFACE_API_KEY, HUGGINGFACEHUB_API_TOKEN, or HF_TOKEN.

When no value is provided, the example defaults to OpenAI. You can still instantiate providers directly in your own code.
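The selection logic above amounts to a small lookup over environment variables. A sketch, with function names invented here for illustration (the example's actual code may differ):

```typescript
type ProviderName = "openai" | "swiss-ai";

// Hypothetical helper: map MARTY_LLM_PROVIDER to a known provider name,
// defaulting to OpenAI as the README describes.
function resolveProviderName(
  env: Record<string, string | undefined>,
): ProviderName {
  const raw = (env.MARTY_LLM_PROVIDER ?? "openai").toLowerCase();
  return raw === "swiss-ai" ? "swiss-ai" : "openai";
}

// Hypothetical helper: the Swiss-AI provider accepts any of several
// token variables, checked in the order listed above.
function resolveSwissAiToken(
  env: Record<string, string | undefined>,
): string | undefined {
  return (
    env.SWISS_AI_API_KEY ??
    env.HUGGINGFACE_API_KEY ??
    env.HUGGINGFACEHUB_API_TOKEN ??
    env.HF_TOKEN
  );
}
```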

Where to Hook Real Providers

  • Replace the generateResponse implementation in llm/openai-llm-provider.ts with real OpenAI API calls.
  • Implement stt/openai-whisper-provider.ts with network calls to Whisper or another STT service.
  • Replace the mock sections in tts/elevenlabs-tts.ts with calls to ElevenLabs or another TTS provider.
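A network-backed provider slots in behind the same interface as the mocks. The sketch below shows one way to call the OpenAI Chat Completions endpoint with fetch (Node 18+); the LLMProvider shape and the class name are assumptions based on the file names above, not the library's exact API:

```typescript
interface LLMRequest {
  prompt: string;
}
interface LLMResponse {
  text: string;
}

// Hypothetical network-backed provider; adapt the interface to match
// llm/llm-provider.ts before wiring it in.
class OpenAIChatProvider {
  constructor(
    private apiKey: string,
    private baseUrl: string = "https://api.openai.com/v1",
    private model: string = "gpt-4o-mini",
  ) {}

  async generateResponse(request: LLMRequest): Promise<LLMResponse> {
    const res = await fetch(`${this.baseUrl}/chat/completions`, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${this.apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: this.model,
        messages: [{ role: "user", content: request.prompt }],
      }),
    });
    if (!res.ok) throw new Error(`OpenAI request failed: ${res.status}`);
    const data = await res.json();
    return { text: data.choices[0].message.content };
  }
}
```

Passing the base URL through the constructor keeps the OPENAI_API_BASE override mentioned above easy to support.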

Tests

There are no automated tests in the repository by default. Add unit tests that exercise provider interfaces and interaction/interaction-controller.ts.

Contributing

Open a PR with a clear description of the changes. Keep provider implementations behind interfaces so they can be swapped for mocks in tests.

License

No license file is included. Add a LICENSE file if you plan to publish or share this project publicly.
