A comprehensive Kotlin Multiplatform library providing unified access to multiple AI/LLM providers including OpenAI, Anthropic Claude, Google Gemini, and Ollama. Built with modern Kotlin practices and designed for seamless integration across JVM, Android, iOS, and native platforms.
- Multi-Provider Support: OpenAI, Anthropic Claude, Google Gemini, Ollama, and custom providers
- Kotlin Multiplatform: JVM, Android, iOS, macOS, and other Kotlin/Native targets
- Streaming Support: Real-time chat completions with Flow-based streaming
- Unified Gateway: Switch between providers seamlessly with a single interface
- Platform-Optimized: Native HTTP clients for each platform (Ktor CIO, NSURLSession)
- Type-Safe: Fully typed APIs with comprehensive data classes
- Modular Design: Use only what you need with granular dependencies
- Well-Tested: Comprehensive test coverage with integration tests
Add the dependency:
```kotlin
implementation("com.tddworks:openai-client-jvm:0.2.3")
```
```kotlin
import com.tddworks.openai.api.OpenAIConfig
import com.tddworks.openai.api.chat.api.*
import com.tddworks.openai.di.initOpenAI

val openAI = initOpenAI(
    OpenAIConfig(
        apiKey = { "your-api-key" }
    )
)
```
```kotlin
// Chat completion
val response = openAI.chatCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("Hello, world!")),
        model = Model.GPT_4O,
        maxTokens = 1000
    )
)

// Streaming chat completion
openAI.streamChatCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("Tell me a story")),
        model = Model.GPT_4O
    )
).collect { chunk ->
    print(chunk.choices?.firstOrNull()?.delta?.content ?: "")
}
```
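Both calls are `suspend` functions, so they must run inside a coroutine. A minimal JVM entry point (a sketch assuming the `openAI` client initialized above) could drive them with `runBlocking`:

```kotlin
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    // Suspend calls need a coroutine scope; runBlocking bridges from a plain main().
    val response = openAI.chatCompletions(
        ChatCompletionRequest(
            messages = listOf(ChatMessage.UserMessage("Hello, world!")),
            model = Model.GPT_4O
        )
    )
    println(response)
}
```

In an Android or server application you would launch these from an existing coroutine scope (e.g. `viewModelScope` or a Ktor route handler) instead of `runBlocking`.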
For applications requiring multiple AI providers:
```kotlin
implementation("com.tddworks:openai-gateway-jvm:0.2.3")
```
```kotlin
import com.tddworks.openai.gateway.api.*
import com.tddworks.openai.gateway.di.initOpenAIGateway

val gateway = initOpenAIGateway(
    defaultProvider = DefaultOpenAIProviderConfig(
        apiKey = { "openai-api-key" }
    ),
    anthropicProvider = AnthropicOpenAIProviderConfig(
        apiKey = { "anthropic-api-key" }
    ),
    ollamaProvider = OllamaOpenAIProviderConfig(
        baseUrl = { "localhost" },
        port = { 11434 }
    )
)

// Use any provider with the same interface
val response = gateway.chatCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("Compare AI models")),
        model = Model("claude-3-sonnet") // or "llama2", "gpt-4", etc.
    )
)
```
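Streaming through the gateway should follow the same pattern; note that a `streamChatCompletions` method on the gateway is an assumption here, mirroring the plain client's streaming API shown earlier:

```kotlin
// Assumed to exist on the gateway, by analogy with the OpenAI client's streaming API.
gateway.streamChatCompletions(
    ChatCompletionRequest(
        messages = listOf(ChatMessage.UserMessage("Stream a short poem")),
        model = Model("claude-3-sonnet")
    )
).collect { chunk ->
    print(chunk.choices?.firstOrNull()?.delta?.content ?: "")
}
```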
For multiplatform projects:
```kotlin
kotlin {
    sourceSets {
        commonMain.dependencies {
            implementation("com.tddworks:openai-client-core:0.2.3")
            implementation("com.tddworks:openai-gateway-core:0.2.3")
        }
    }
}
```
For JVM/Android projects:
```kotlin
dependencies {
    implementation("com.tddworks:openai-client-jvm:0.2.3")
    implementation("com.tddworks:anthropic-client-jvm:0.2.3")
    implementation("com.tddworks:ollama-client-jvm:0.2.3")
    implementation("com.tddworks:gemini-client-jvm:0.2.3")
    implementation("com.tddworks:openai-gateway-jvm:0.2.3")
}
```
```xml
<dependency>
    <groupId>com.tddworks</groupId>
    <artifactId>openai-client-jvm</artifactId>
    <version>0.2.3</version>
</dependency>
```
| Module | Description | Platforms |
|---|---|---|
| `openai-client-*` | OpenAI API client (chat, images, completions) | JVM, iOS, macOS |
| `anthropic-client-*` | Anthropic Claude API client | JVM, iOS, macOS |
| `ollama-client-*` | Ollama local LLM client | JVM, iOS, macOS |
| `gemini-client-*` | Google Gemini API client | JVM, iOS, macOS |
| `openai-gateway-*` | Multi-provider gateway | JVM, iOS, macOS |
| `common` | Shared networking utilities | All platforms |
```kotlin
// Image generation
val images = openAI.images(
    ImageCreate(
        prompt = "A beautiful sunset over mountains",
        size = Size.SIZE_1024x1024,
        quality = Quality.HD,
        n = 1
    )
)
```
```kotlin
// Vision: chat completion with mixed text and image content
val response = openAI.chatCompletions(
    ChatCompletionRequest(
        messages = listOf(
            ChatMessage.UserMessage(
                content = listOf(
                    VisionMessageContent.TextContent("What's in this image?"),
                    VisionMessageContent.ImageContent(
                        imageUrl = ImageUrl("data:image/jpeg;base64,${base64Image}")
                    )
                )
            )
        ),
        model = Model.GPT_4_VISION,
        maxTokens = 1000
    )
)
```
```kotlin
val claude = initAnthropic(
    AnthropicConfig(apiKey = { "your-anthropic-key" })
)

val message = claude.messages(
    CreateMessageRequest(
        messages = listOf(
            Message(
                role = Role.USER,
                content = listOf(ContentMessage.TextContent("Explain quantum computing"))
            )
        ),
        model = AnthropicModel.CLAUDE_3_SONNET,
        maxTokens = 1000
    )
)
```
```kotlin
val ollama = initOllama(
    OllamaConfig(baseUrl = "localhost", port = 11434)
)

val response = ollama.chat(
    OllamaChatRequest(
        model = OllamaModel.LLAMA2.value,
        messages = listOf(
            OllamaChatMessage(
                role = "user",
                content = "What is the capital of France?"
            )
        )
    )
)
```
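Ollama serves models from your local machine, so the model referenced in the request must already be downloaded and the server must be running (standard Ollama CLI commands; your local setup is assumed):

```shell
ollama pull llama2   # download the model locally
ollama serve         # start the local server (listens on port 11434 by default)
```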
This library follows a clean, modular architecture:
```
┌──────────────────────┐
│     Applications     │  (Your Kotlin/Java/Swift apps)
├──────────────────────┤
│    OpenAI Gateway    │  (Unified interface for all providers)
├──────────────────────┤
│   Provider Clients   │  (OpenAI, Anthropic, Ollama, Gemini)
├──────────────────────┤
│  Common Networking   │  (HTTP abstraction, serialization)
└──────────────────────┘
```
- HttpRequester: Cross-platform HTTP client abstraction using Ktor
- Provider Configs: Type-safe configuration for each AI provider
- Streaming Support: Flow-based streaming for real-time responses
- Error Handling: Comprehensive exception hierarchy with detailed error information
- Dependency Injection: Koin-based DI for clean separation of concerns
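Since the library's actual exception class names are not documented here, a hedged sketch of defensive error handling around a call (using plain `Exception`; substitute the library's own types once known) might look like:

```kotlin
// Illustrative only: the library's real exception hierarchy may expose
// more specific types (e.g. HTTP status or provider error details).
val reply = try {
    openAI.chatCompletions(
        ChatCompletionRequest(
            messages = listOf(ChatMessage.UserMessage("Hello")),
            model = Model.GPT_4O
        )
    )
} catch (e: Exception) {
    // Inspect e (network failure, auth error, rate limit) before retrying.
    null
}
```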
- ✅ JVM (Java 8+, Android API 21+)
- ✅ iOS (iOS 14+)
- ✅ macOS (macOS 11+)
- 🚧 watchOS (planned)
- 🚧 tvOS (planned)
- 🚧 Linux (planned)
- 🚧 Windows (planned)
| Platform | HTTP Client | Streaming | Local Storage |
|---|---|---|---|
| JVM | Ktor CIO | ✅ | File System |
| Android | Ktor CIO | ✅ | File System |
| iOS | NSURLSession | ✅ | UserDefaults |
| macOS | NSURLSession | ✅ | UserDefaults |
Set these environment variables or provide them programmatically:
```shell
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GEMINI_API_KEY=your-gemini-key
```
```kotlin
val openAI = initOpenAI(
    OpenAIConfig(
        baseUrl = { "https://api.openai.com/v1" },
        apiKey = { System.getenv("OPENAI_API_KEY") },
        organization = { "your-org-id" } // optional
    )
)
```
```kotlin
val anthropic = initAnthropic(
    AnthropicConfig(
        apiKey = { System.getenv("ANTHROPIC_API_KEY") },
        anthropicVersion = { "2023-06-01" },
        baseUrl = { "https://api.anthropic.com" }
    )
)
```
```shell
./gradlew test                              # unit tests
./gradlew integrationTest                   # integration tests
./gradlew koverHtmlReport                   # generate coverage report
open build/reports/kover/html/index.html    # view coverage report
```
We welcome contributions! Please see our Contributing Guidelines for details.
1. Clone the repository:

   ```shell
   git clone https://github.com/tddworks/openai-kotlin.git
   cd openai-kotlin
   ```

2. Build the project:

   ```shell
   ./gradlew build
   ```

3. Run tests:

   ```shell
   ./gradlew allTests
   ```

4. Format code:

   ```shell
   ./gradlew spotlessApply
   ```
This project uses Spotless for code formatting; please run `./gradlew spotlessApply` before submitting PRs.
Copyright 2024 TDD Works
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
- Documentation
- GitHub Discussions
- Issue Tracker
- Email: [email protected]
- OpenAI for their powerful APIs
- Anthropic for Claude AI
- Ollama for local LLM support
- Google for Gemini API
- JetBrains for Kotlin Multiplatform
- Ktor for cross-platform HTTP client
Made with ❤️ by TDD Works