Access to all public properties from a service point of view #88

Open
rodrigovaras opened this issue Jun 25, 2024 · 2 comments

@rodrigovaras

Our .NET service wants to receive OpenAI REST payloads and then route each request to different service-specific implementations. We were using the Azure OpenAI API (which is now deprecated in favor of this project).
That library allowed us to process every property on the chat request and also to construct chat responses.
For some reason, the code generation in this project hides everything it can, targeting only the client scenario.
We want to use this project, but it's almost impossible. One example:

public partial class ChatCompletionOptions
{
    // CUSTOM:
    // - Made internal. This value comes from a parameter on the client method.
    // - Added setter.
    /// <summary>
    /// A list of messages comprising the conversation so far. Example Python code.
    /// Please note <see cref="ChatMessage"/> is the base class. According to the scenario, a derived class of the base class might need to be assigned here, or this property needs to be casted to one of the possible derived classes.
    /// The available derived classes include <see cref="SystemChatMessage"/>, <see cref="UserChatMessage"/>, <see cref="AssistantChatMessage"/>, and <see cref="ToolChatMessage"/>.
    /// </summary>
    [CodeGenMember("Messages")]
    internal IList<ChatMessage> Messages { get; set; }
}

Our service can't access the Messages property, even though in theory we can deserialize the payload.
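
For illustration, here is a minimal sketch of what we are trying to do, assuming ChatCompletionOptions keeps the IJsonModel<T>/IPersistableModel<T> plumbing that the generated models expose through System.ClientModel (the JSON body and model name below are placeholders):

using System;
using System.ClientModel.Primitives;
using OpenAI.Chat;

// Placeholder payload; in our service this comes from the incoming HTTP request body.
BinaryData payload = BinaryData.FromString(
    "{ \"model\": \"gpt-4o\", \"messages\": [ { \"role\": \"user\", \"content\": \"Hi\" } ] }");

// Deserialization itself goes through the IJsonModel plumbing...
ChatCompletionOptions? options = ModelReaderWriter.Read<ChatCompletionOptions>(payload);

// ...but our service code cannot inspect the conversation afterwards, because Messages is internal:
// var messages = options.Messages;   // error CS0122: inaccessible due to its protection level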

Bottom line: we are forced to use the Betalgo OpenAI library.

Any plan to support a service scenario?

@joseharriaga
Collaborator

joseharriaga commented Jun 25, 2024

Thank you for reaching out, @rodrigovaras! I would like to ask a few clarifying questions:

  1. Are you saying that you want strongly-typed classes that map directly to the body parameters of each request? What about other types of parameters such as path and/or query parameters?
  2. Are you using the clients in the OpenAI library too, or are you only using the classes that represent requests and responses?

@rodrigovaras
Author

  1. We want type-safe payloads; no need for path and query parameters.
  2. No clients are used (for now).

We could eventually use the client classes for one scenario where the provider is itself an external OpenAI service that we route the call to; we currently have our own way to route the JSON payload in that scenario. For now we are not planning to use the client APIs, only the 'models' classes, to properly deserialize a JSON payload and then produce a response depending on the provider the request hits (mostly based on the model name).
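
To make the routing concrete, here is a hypothetical sketch of how such a composite could pick a provider from the model name. IOpenAIServiceProvider and the prefix lookup are illustrative placeholders, not our actual implementation:

using System;
using System.Collections.Generic;

public interface IOpenAIServiceProvider
{
    // HandleStreamRequest(...) and the rest of the provider surface would live here.
}

public class OpenAIServiceComposite
{
    private readonly IReadOnlyDictionary<string, IOpenAIServiceProvider> _providersByModelPrefix;

    public OpenAIServiceComposite(IReadOnlyDictionary<string, IOpenAIServiceProvider> providersByModelPrefix)
        => _providersByModelPrefix = providersByModelPrefix;

    // Pick the backing provider from the model name carried in the deserialized request.
    public IOpenAIServiceProvider GetOpenAIServiceProvider(string model)
    {
        foreach (var (prefix, provider) in _providersByModelPrefix)
        {
            if (model.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
            {
                return provider;
            }
        }

        throw new InvalidOperationException($"No provider registered for model '{model}'.");
    }
}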

Here is a snippet from our minimal API.
We would love to replace the type ChatCompletionCreateRequest (from the Betalgo package) with this project's ChatCompletionOptions.

    // Open AI services support
    var chatApi = app.MapGroup("/v1/chat/completions");
    chatApi.MapPost("/", async (OpenAIServiceComposite openAIServiceComposite, HttpRequest request, ILogger<ChatApi> logger, CancellationToken cancellationToken) =>
    {

        ChatCompletionCreateRequest chatRequest;
        try
        {
            chatRequest = await request.ReadFromJsonAsync<ChatCompletionCreateRequest>(cancellationToken) ?? throw new BadHttpRequestException("JSON can't be null");
        }
        catch (Exception ex)
        {
            logger.LogError(ex, invalidOpenAIJsonMessage);
            return Results.Problem(
                statusCode: StatusCodes.Status500InternalServerError,
                title: invalidOpenAIJsonMessage,
                detail: ex.Message);
        }

        try
        {
            if (chatRequest.Stream == true)
            {
                return openAIServiceComposite.GetOpenAIServiceProvider(chatRequest.Model!).HandleStreamRequest(chatRequest, cancellationToken);
            }

            var result = await openAIServiceComposite.HandleChatCompletionRequestAsync(chatRequest, cancellationToken);
            return Results.Text(result.ToJson(), contentType: MediaTypeNames.Application.Json);
        }
        catch (Exception ex)
        {
            // (The pasted snippet was cut off here; completed along the lines of the JSON-error handler above.)
            logger.LogError(ex, "Failed to handle chat completion request.");
            return Results.Problem(
                statusCode: StatusCodes.Status500InternalServerError,
                title: "Failed to handle chat completion request.",
                detail: ex.Message);
        }
    });

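For comparison, a hypothetical version of the same handler built on this project's ChatCompletionOptions could look like the sketch below. It assumes ModelReaderWriter can still read the type and that Model, Messages, and Stream gain public getters, which is exactly what we are asking for; serializing the result with ModelReaderWriter.Write likewise assumes the provider returns this project's ChatCompletion type.

    // Hypothetical rewrite of the handler above with this project's ChatCompletionOptions.
    chatApi.MapPost("/", async (OpenAIServiceComposite openAIServiceComposite, HttpRequest request, CancellationToken cancellationToken) =>
    {
        BinaryData payload = await BinaryData.FromStreamAsync(request.Body, cancellationToken);
        ChatCompletionOptions? chatRequest = ModelReaderWriter.Read<ChatCompletionOptions>(payload);
        if (chatRequest is null)
        {
            return Results.Problem(statusCode: StatusCodes.Status400BadRequest, title: invalidOpenAIJsonMessage);
        }

        // Route exactly as before, once the request properties are readable from service code.
        if (chatRequest.Stream == true)
        {
            return openAIServiceComposite.GetOpenAIServiceProvider(chatRequest.Model!).HandleStreamRequest(chatRequest, cancellationToken);
        }

        var result = await openAIServiceComposite.HandleChatCompletionRequestAsync(chatRequest, cancellationToken);
        return Results.Text(ModelReaderWriter.Write(result).ToString(), contentType: MediaTypeNames.Application.Json);
    });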