[Go] Inaccurate and Incomplete Documentation for OpenAI-Compatible Providers in Genkit Go SDK #3583

@geocine

Description

The documentation for integrating OpenAI-compatible providers (e.g., OpenRouter) in the Genkit Go SDK is misleading and does not reflect the actual implementation. The suggested configuration methods fail, and users must resort to undocumented workarounds. Key issues include:

  1. The recommended option.RequestOption setup (e.g., WithBaseURL) does not integrate properly with Genkit's Generate function.
  2. The compat_oai.OpenAIConfig type referenced in the docs does not exist in the compat_oai package.
  3. The Provider field in OpenAICompatible can be set to arbitrary values (e.g., "openai" or "geocine"), but the provider prefix is unexpectedly trimmed from model names during resolution, breaking custom provider setups. This is caused by trimming logic in the source code.

These issues prevent straightforward integration with providers like OpenRouter and lead to errors such as model resolution failures.

Reproduction Steps

Issue 1: Incorrect Configuration for Compatible Providers

  1. Follow the documentation's "Configuration > Use with Compatible Providers" section.
  2. Set up options with option.WithBaseURL("https://openrouter.ai/api/v1") and an API key.
  3. Attempt to use genkit.Generate with a model like "openai/gpt-3.5-turbo" (adapted for OpenRouter).
  4. Observe failure: The base URL is not respected, or generation errors occur due to improper plugin initialization.

Expected Behavior

  • The option.RequestOption (including WithBaseURL) should configure the client for the custom endpoint.
  • Generation should route requests to the custom provider (e.g., OpenRouter) without additional plugin setup.

Actual Behavior

  • Requests fail or default to OpenAI's endpoint.
  • Workaround: Manually create an OpenAICompatible plugin instance with Opts (including WithBaseURL) and a Provider field (e.g., "openai"). Set the model name to include the provider prefix, like "openai/openrouter/sonoma-dusk-alpha".
  • Even with this, the provider prefix ("openai/") is trimmed from the model name in the internal resolution logic, causing mismatches for custom providers.
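
The mismatch can be illustrated with a minimal, stdlib-only sketch of the presumed trimming behavior (`resolveModel` and its exact form are assumptions for illustration, not the actual `compat_oai` source):

```go
package main

import (
	"fmt"
	"strings"
)

// resolveModel mimics the presumed resolution step in compat_oai:
// the configured provider prefix is trimmed from the model name
// before the registry lookup.
func resolveModel(provider, name string) string {
	return strings.TrimPrefix(name, provider+"/")
}

func main() {
	// With Provider: "openai" and an OpenRouter-style model name, the
	// prefix is stripped, so the lookup key no longer matches the
	// fully prefixed name the caller supplied.
	fmt.Println(resolveModel("openai", "openai/openrouter/sonoma-dusk-alpha"))
	// An arbitrary provider such as "geocine" is trimmed the same way.
	fmt.Println(resolveModel("geocine", "geocine/openrouter/sonoma-dusk-alpha"))
}
```

Both calls print `openrouter/sonoma-dusk-alpha`, which is consistent with the resolution mismatch described above: the name the user registers under the provider prefix is not the name used for the internal lookup.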

Working Example (Undocumented)

```go
package main

import (
    "context"
    "log"
    "strings"

    "github.com/firebase/genkit/go/ai"
    "github.com/firebase/genkit/go/genkit"
    "github.com/firebase/genkit/go/plugins/compat_oai"
    "github.com/openai/openai-go/option"
)

func main() {
    // Set up the plugin; Provider becomes the prefix for registered models.
    opts := []option.RequestOption{
        option.WithAPIKey("YOUR_OPENROUTER_API_KEY"),
        option.WithBaseURL("https://openrouter.ai/api/v1"),
    }
    plugin := &compat_oai.OpenAICompatible{
        Opts:     opts,
        Provider: "openai", // Arbitrary; works but gets trimmed during resolution
    }

    // Init Genkit (initializes the client in plugin.Init; sets the default model)
    modelName := "openai/openrouter/sonoma-dusk-alpha" // Provider prefix required but trimmed internally
    g := genkit.Init(context.Background(),
        genkit.WithPlugins(plugin),
        genkit.WithDefaultModel(modelName), // Full name; resolved dynamically
    )
    log.Println("Genkit initialized. Default model:", modelName)

    // Generate using the default model
    prompt := "Tell me a joke about Go programming."
    resp, err := genkit.Generate(context.Background(), g,
        ai.WithPrompt(prompt),
    )
    if err != nil {
        log.Printf("Generate failed: %v", err)
        if strings.Contains(err.Error(), "401") || strings.Contains(err.Error(), "invalid_api_key") {
            log.Println("Debug: Invalid API key. Regenerate at openrouter.ai.")
        } else if strings.Contains(err.Error(), "404") || strings.Contains(err.Error(), "model") {
            log.Println("Debug: Invalid model. Use one from OpenRouter's list (update modelName).")
        } else if strings.Contains(err.Error(), "429") {
            log.Println("Debug: Rate limited. Wait or check OpenRouter usage.")
        }
        return
    }
    log.Println("Success! Response:", resp.Text())
}
```
  • This works for OpenRouter, but the Provider: "openai" is arbitrary (can be "geocine", etc.), and the prefix is trimmed, breaking truly custom providers.
  • Example with custom provider: Replace "openai" with "geocine" in both Provider and model name ("geocine/openrouter/sonoma-dusk-alpha"), but trimming still occurs.

Issue 2: Non-Existent OpenAIConfig Type

  1. Follow the documentation's "Common Configuration" section.
  2. Attempt to create config := &compat_oai.OpenAIConfig{ ... } with fields like Temperature, MaxOutputTokens, etc.
  3. Pass it to genkit.Generate via ai.WithConfig(config).

Expected Behavior

  • compat_oai.OpenAIConfig should be a valid type for passing generation parameters.

Actual Behavior

  • Compilation error: OpenAIConfig does not exist in compat_oai.
  • Correct approach (undocumented): Use openai.ChatCompletionNewParams from the openai-go library.

Incorrect Example (from Docs)

```go
import "github.com/firebase/genkit/go/plugins/compat_oai"

config := &compat_oai.OpenAIConfig{
    Temperature:     0.7,
    MaxOutputTokens: 1000,
    TopP:            0.9,
    StopSequences:   []string{"END"},
}

resp, err := genkit.Generate(ctx, g,
    ai.WithModel(model),
    ai.WithPrompt("Your prompt here"),
    ai.WithConfig(config),
)
```

Corrected Example

```go
import (
    "github.com/openai/openai-go"
    // ... other imports
)

resp, err := genkit.Generate(ctx, g,
    ai.WithPrompt(prompt),
    ai.WithConfig(&openai.ChatCompletionNewParams{
        MaxCompletionTokens: openai.Int(2000),
        Temperature:         openai.Float(0.7),
        // Add other params like TopP, StopSequences as needed
    }),
)
```
  • Note: Parameter names and types differ (e.g., MaxCompletionTokens instead of MaxOutputTokens).

Environment

  • Genkit Version: v1.0.1
  • Go Version: go1.24.5
  • Provider: OpenRouter (tested with openrouter/sonoma-dusk-alpha)
  • OS: Windows

Suggested Fixes

  1. Update documentation to reflect the OpenAICompatible plugin setup as the standard for custom providers.
  2. Remove the provider trimming logic, or make it optional (e.g., in compat_oai.go). The trimming assumes standard provider names and breaks custom ones, since Provider can be any arbitrary value.
  3. Replace OpenAIConfig references with openai.ChatCompletionNewParams examples.
  4. Add official examples for OpenRouter and custom provider integrations, including full code snippets for plugin setup, model naming conventions (with and without prefixes), error handling, and configuration parameters.
  5. Provide a configuration option to control or disable provider prefix trimming for advanced use cases with custom providers.
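
One possible shape for fix 5, sketched with hypothetical names (`TrimmablePlugin` and `TrimProviderPrefix` are not existing Genkit APIs; this only illustrates the suggested toggle):

```go
package main

import (
	"fmt"
	"strings"
)

// TrimmablePlugin is a hypothetical sketch of a plugin that lets the
// caller opt out of provider-prefix trimming; the real OpenAICompatible
// struct has different fields.
type TrimmablePlugin struct {
	Provider           string
	TrimProviderPrefix bool // false preserves the full model name for custom providers
}

func (p *TrimmablePlugin) resolve(name string) string {
	if p.TrimProviderPrefix {
		return strings.TrimPrefix(name, p.Provider+"/")
	}
	return name
}

func main() {
	// Standard provider: trimming on, matching today's behavior.
	standard := &TrimmablePlugin{Provider: "openai", TrimProviderPrefix: true}
	fmt.Println(standard.resolve("openai/gpt-4o"))

	// Custom provider: trimming off, so the full prefixed name survives resolution.
	custom := &TrimmablePlugin{Provider: "geocine", TrimProviderPrefix: false}
	fmt.Println(custom.resolve("geocine/openrouter/sonoma-dusk-alpha"))
}
```

A toggle like this would keep existing OpenAI setups working unchanged while letting arbitrary Provider values round-trip their model names intact.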

Additional Context

  • The trimming in compat_oai.go (line 103) assumes standard providers but breaks custom ones.
  • Tested with OpenRouter API key and models from their dashboard.
  • No errors in basic OpenAI usage; issues are specific to compatible providers.

Please let me know if more details or a minimal repro project are needed!

Metadata

Labels: bug (Something isn't working), go
