
OpenAI Kotlin


A comprehensive Kotlin Multiplatform library providing unified access to multiple AI/LLM providers including OpenAI, Anthropic Claude, Google Gemini, and Ollama. Built with modern Kotlin practices and designed for seamless integration across JVM, Android, iOS, and native platforms.

✨ Features

  • πŸ”— Multi-Provider Support: OpenAI, Anthropic Claude, Google Gemini, Ollama, and custom providers
  • 🌐 Kotlin Multiplatform: JVM, Android, iOS, macOS, and other Kotlin/Native targets
  • πŸš€ Streaming Support: Real-time chat completions with Flow-based streaming
  • πŸ”„ Unified Gateway: Switch between providers seamlessly with a single interface
  • πŸ“± Platform-Optimized: Native HTTP clients for each platform (Ktor CIO, NSURLSession)
  • 🎯 Type-Safe: Fully typed APIs with comprehensive data classes
  • πŸ“¦ Modular Design: Use only what you need with granular dependencies
  • πŸ§ͺ Well-Tested: Comprehensive test coverage with integration tests

πŸš€ Quick Start

Basic OpenAI Client

Add the dependency:

implementation("com.tddworks:openai-client-jvm:0.2.3")

Then create a client and make requests:

import com.tddworks.openai.api.OpenAIConfig
import com.tddworks.openai.api.chat.api.*
import com.tddworks.openai.di.initOpenAI

val openAI = initOpenAI(
    OpenAIConfig(
        apiKey = { "your-api-key" }
    )
)

// Chat completion
val response = openAI.chatCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("Hello, world!")),
        model = Model.GPT_4O,
        maxTokens = 1000
    )
)

// Streaming chat completion
openAI.streamChatCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("Tell me a story")),
        model = Model.GPT_4O
    )
).collect { chunk ->
    print(chunk.choices?.firstOrNull()?.delta?.content ?: "")
}

Multi-Provider Gateway

For applications requiring multiple AI providers:

implementation("com.tddworks:openai-gateway-jvm:0.2.3")

import com.tddworks.openai.gateway.di.initOpenAIGateway
import com.tddworks.openai.gateway.api.*

val gateway = initOpenAIGateway(
    defaultProvider = DefaultOpenAIProviderConfig(
        apiKey = { "openai-api-key" }
    ),
    anthropicProvider = AnthropicOpenAIProviderConfig(
        apiKey = { "anthropic-api-key" }
    ),
    ollamaProvider = OllamaOpenAIProviderConfig(
        baseUrl = { "localhost" },
        port = { 11434 }
    )
)

// Use any provider with the same interface
val response = gateway.chatCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("Compare AI models")),
        model = Model("claude-3-sonnet") // or "llama2", "gpt-4", etc.
    )
)
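
Streaming should work the same way through the gateway. The sketch below assumes the gateway exposes the same Flow-based streamChatCompletions API as the direct OpenAI client shown above; verify against the published gateway interface before relying on it.

// Sketch: streaming through the gateway (assumes the same streaming API
// as the direct client).
gateway.streamChatCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("Summarize this repository")),
        model = Model("claude-3-sonnet")
    )
).collect { chunk ->
    print(chunk.choices?.firstOrNull()?.delta?.content ?: "")
}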

πŸ“¦ Installation

Gradle (Kotlin DSL)

For multiplatform projects:

kotlin {
    sourceSets {
        commonMain.dependencies {
            implementation("com.tddworks:openai-client-core:0.2.3")
            implementation("com.tddworks:openai-gateway-core:0.2.3")
        }
    }
}

For JVM/Android projects:

dependencies {
    implementation("com.tddworks:openai-client-jvm:0.2.3")
    implementation("com.tddworks:anthropic-client-jvm:0.2.3")
    implementation("com.tddworks:ollama-client-jvm:0.2.3")
    implementation("com.tddworks:gemini-client-jvm:0.2.3")
    implementation("com.tddworks:openai-gateway-jvm:0.2.3")
}

Maven

<dependency>
    <groupId>com.tddworks</groupId>
    <artifactId>openai-client-jvm</artifactId>
    <version>0.2.3</version>
</dependency>

Available Modules

| Module | Description | Platforms |
|--------|-------------|-----------|
| openai-client-* | OpenAI API client (chat, images, completions) | JVM, iOS, macOS |
| anthropic-client-* | Anthropic Claude API client | JVM, iOS, macOS |
| ollama-client-* | Ollama local LLM client | JVM, iOS, macOS |
| gemini-client-* | Google Gemini API client | JVM, iOS, macOS |
| common | Shared networking utilities | All platforms |
| openai-gateway-* | Multi-provider gateway | JVM, iOS, macOS |

πŸ’‘ Usage Examples

Image Generation

val images = openAI.images(
    ImageCreate(
        prompt = "A beautiful sunset over mountains",
        size = Size.SIZE_1024x1024,
        quality = Quality.HD,
        n = 1
    )
)

Vision (Image Analysis)

val response = openAI.chatCompletions(
    ChatCompletionRequest(
        messages = listOf(
            ChatMessage.UserMessage(
                content = listOf(
                    VisionMessageContent.TextContent("What's in this image?"),
                    VisionMessageContent.ImageContent(
                        imageUrl = ImageUrl("data:image/jpeg;base64,${base64Image}")
                    )
                )
            )
        ),
        model = Model.GPT_4_VISION,
        maxTokens = 1000
    )
)

Anthropic Claude

val claude = initAnthropic(
    AnthropicConfig(apiKey = { "your-anthropic-key" })
)

val message = claude.messages(
    CreateMessageRequest(
        messages = listOf(
            Message(
                role = Role.USER,
                content = listOf(ContentMessage.TextContent("Explain quantum computing"))
            )
        ),
        model = AnthropicModel.CLAUDE_3_SONNET,
        maxTokens = 1000
    )
)

Local Ollama

val ollama = initOllama(
    OllamaConfig(baseUrl = "localhost", port = 11434)
)

val response = ollama.chat(
    OllamaChatRequest(
        model = OllamaModel.LLAMA2.value,
        messages = listOf(
            OllamaChatMessage(
                role = "user",
                content = "What is the capital of France?"
            )
        )
    )
)

πŸ—οΈ Architecture

This library follows a clean, modular architecture:

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   Applications      β”‚  (Your Kotlin/Java/Swift apps)
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€€
β”‚   OpenAI Gateway    β”‚  (Unified interface for all providers)
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€€
β”‚   Provider Clients  β”‚  (OpenAI, Anthropic, Ollama, Gemini)
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€€
β”‚   Common Networking β”‚  (HTTP abstraction, serialization)
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Core Components

  • HttpRequester: Cross-platform HTTP client abstraction using Ktor
  • Provider Configs: Type-safe configuration for each AI provider
  • Streaming Support: Flow-based streaming for real-time responses
  • Error Handling: Comprehensive exception hierarchy with detailed error information
  • Dependency Injection: Koin-based DI for clean separation of concerns

🌍 Platform Support

Supported Platforms

  • βœ… JVM (Java 8+, Android API 21+)
  • βœ… iOS (iOS 14+)
  • βœ… macOS (macOS 11+)
  • 🚧 watchOS (planned)
  • 🚧 tvOS (planned)
  • 🚧 Linux (planned)
  • 🚧 Windows (planned)

Platform-Specific Features

| Platform | HTTP Client | Streaming | Local Storage |
|----------|-------------|-----------|---------------|
| JVM | Ktor CIO | βœ… | File System |
| Android | Ktor CIO | βœ… | File System |
| iOS | NSURLSession | βœ… | UserDefaults |
| macOS | NSURLSession | βœ… | UserDefaults |

πŸ”§ Configuration

Environment Variables

Set these environment variables or provide them programmatically:

OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GEMINI_API_KEY=your-gemini-key

Configuration Examples

OpenAI with Custom Base URL

val openAI = initOpenAI(
    OpenAIConfig(
        baseUrl = { "https://api.openai.com/v1" },
        apiKey = { System.getenv("OPENAI_API_KEY") },
        organization = { "your-org-id" } // optional
    )
)

Anthropic with Custom Headers

val anthropic = initAnthropic(
    AnthropicConfig(
        apiKey = { System.getenv("ANTHROPIC_API_KEY") },
        anthropicVersion = { "2023-06-01" },
        baseUrl = { "https://api.anthropic.com" }
    )
)

πŸ§ͺ Testing

Unit Tests

./gradlew test

Integration Tests

./gradlew integrationTest

Code Coverage

./gradlew koverHtmlReport
open build/reports/kover/html/index.html

🀝 Contributing

We welcome contributions! Please see our Contributing Guidelines for details.

Development Setup

  1. Clone the repository:

    git clone https://github.com/tddworks/openai-kotlin.git
    cd openai-kotlin

  2. Build the project:

    ./gradlew build

  3. Run tests:

    ./gradlew allTests

  4. Format code:

    ./gradlew spotlessApply

Code Style

This project uses Spotless for code formatting. Please run ./gradlew spotlessApply before submitting PRs.

πŸ“„ License

Copyright 2024 TDD Works

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.



Made with ❀️ by TDD Works
