Releases: jamesrochabrun/SwiftOpenAI

SwiftOpenAI v3.9.2

12 Nov 19:17
c3a04bb

Gemini


Gemini is now accessible through the OpenAI Library. See the announcement here.
SwiftOpenAI supports all OpenAI endpoints. However, please refer to the Gemini documentation to understand which APIs are currently compatible.

You can instantiate an OpenAIService using your Gemini API key like this:

let geminiAPIKey = "your_api_key"
let baseURL = "https://generativelanguage.googleapis.com"
let version = "v1beta"

let service = OpenAIServiceFactory.service(
   apiKey: geminiAPIKey,
   overrideBaseURL: baseURL,
   overrideVersion: version)

You can now create a chat request using the .custom model parameter and pass the model name as a string.

let parameters = ChatCompletionParameters(
   messages: [.init(
      role: .user,
      content: content)], // `content` is your user message, e.g. .text("Hello, Gemini!")
   model: .custom("gemini-1.5-flash"))

let stream = try await service.startStreamedChat(parameters: parameters)
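
The returned stream can be iterated asynchronously to assemble the reply as it arrives. A minimal sketch, assuming each chunk exposes its partial text at choices.first?.delta.content (check the library's chunk type for the exact shape):

var reply = ""
for try await chunk in stream {
   // Append each partial delta as it arrives. The property path is an
   // assumption based on the OpenAI streaming format; adjust to the library's types.
   if let delta = chunk.choices.first?.delta.content {
      reply += delta
      print(delta, terminator: "")
   }
}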

SwiftOpenAI v3.9.1

28 Oct 06:42

SwiftOpenAI v3.9.0

18 Oct 21:45
2b6f26c
  • Added the store parameter for the new Evals framework (see the sketch after this list).
  • Added support for Chat Completions Audio generation.
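
A rough sketch of passing the store flag alongside the other chat parameters. The parameter name mirrors the OpenAI API field, and the exact SwiftOpenAI label is an assumption, so check the ChatCompletionParameters initializer:

// `store: true` asks the API to persist the completion for later use in Evals.
// The parameter name is an assumption mirroring the OpenAI API field.
let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Hello!"))],
   model: .gpt4o20240806,
   store: true)

let chat = try await service.startChat(parameters: parameters)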

What's Changed

Full Changelog: v3.8.2...v3.9.0

SwiftOpenAI v3.8.2

12 Oct 06:53
2e74797

What's Changed

New Contributors

Full Changelog: v3.8.1...v3.8.2

SwiftOpenAI v3.8.1

28 Sep 05:42
bc84564

What's Changed

Full Changelog: v3.8.0...v3.8.1

SwiftOpenAI v3.8.0

12 Sep 18:37

Adding o1 models to the Models list.
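
A minimal sketch of reaching an o1 model through the existing .custom model parameter; if your installed version exposes dedicated enum cases for the o1 models, prefer those instead:

// Uses the raw model identifier via `.custom`. Dedicated cases for the o1
// models (if present in your version) can be used in the same position.
let parameters = ChatCompletionParameters(
   messages: [.init(role: .user, content: .text("Explain the Monty Hall problem."))],
   model: .custom("o1-preview"))

let chat = try await service.startChat(parameters: parameters)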

⚠️ Important:

Other:

Fixes #84

SwiftOpenAI v3.7.0

26 Aug 06:34

Fixing version issues.

What's Changed

Full Changelog: v3.6.2...v3.7

SwiftOpenAI v3.4.0

12 Aug 23:16

Structured Outputs

Documentation:

Must-knows:

  • All fields must be required. To use Structured Outputs, all fields or function parameters must be specified as required.

  • Although all fields must be required (and the model will return a value for each parameter), it is possible to emulate an optional parameter by using a union type with null.

  • Objects have limitations on nesting depth and size. A schema may have up to 100 object properties total, with up to 5 levels of nesting.

  • additionalProperties: false must always be set in objects.
    additionalProperties controls whether it is allowable for an object to contain additional keys / values that were not defined in the JSON Schema.
    Structured Outputs only supports generating specified keys / values, so we require developers to set additionalProperties: false to opt into Structured Outputs.

  • Key ordering: when using Structured Outputs, outputs will be produced in the same order as the keys in the schema.

  • Recursive schemas are supported

How to use Structured Outputs in SwiftOpenAI

  1. Function calling: Structured Outputs via tools is available by setting strict: true within your function definition. This feature works with all models that support tools, including gpt-4-0613, gpt-3.5-turbo-0613, and later models. When Structured Outputs are enabled, model outputs will match the supplied tool definition.

Using this schema:

{
  "schema": {
    "type": "object",
    "properties": {
      "steps": {
        "type": "array",
        "items": {
          "type": "object",
          "properties": {
            "explanation": {
              "type": "string"
            },
            "output": {
              "type": "string"
            }
          },
          "required": ["explanation", "output"],
          "additionalProperties": false
        }
      },
      "final_answer": {
        "type": "string"
      }
    },
    "required": ["steps", "final_answer"],
    "additionalProperties": false
  }
}

You can use the convenient JSONSchema object like this:

// 1: Define the Step schema object

let stepSchema = JSONSchema(
   type: .object,
   properties: [
      "explanation": JSONSchema(type: .string),
      "output": JSONSchema(
         type: .string)
   ],
   required: ["explanation", "output"],
   additionalProperties: false
)

// 2. Define the steps Array schema.

let stepsArraySchema = JSONSchema(type: .array, items: stepSchema)

// 3. Define the final Answer schema.

let finalAnswerSchema = JSONSchema(type: .string)

// 4. Define the math response JSON schema.

let mathResponseSchema = JSONSchema(
      type: .object,
      properties: [
         "steps": stepsArraySchema,
         "final_answer": finalAnswerSchema
      ],
      required: ["steps", "final_answer"],
      additionalProperties: false
)

let tool = ChatCompletionParameters.Tool(
   function: .init(
      name: "math_response",
      strict: true,
      parameters: mathResponseSchema))

let prompt = "solve 8x + 31 = 2"
let systemMessage = ChatCompletionParameters.Message(role: .system, content: .text("You are a math tutor"))
let userMessage = ChatCompletionParameters.Message(role: .user, content: .text(prompt))
let parameters = ChatCompletionParameters(
   messages: [systemMessage, userMessage],
   model: .gpt4o20240806,
   tools: [tool])

let chat = try await service.startChat(parameters: parameters)
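
To read the structured arguments back out of the response, inspect the tool calls on the first choice. A minimal sketch; the property path follows the OpenAI response shape and is an assumption about the exact SwiftOpenAI accessors:

// The property names are an assumption based on the OpenAI response format;
// check ChatCompletionObject in the library for the exact accessors.
if let toolCall = chat.choices.first?.message.toolCalls?.first {
   // `arguments` is a JSON string that conforms to the math_response schema.
   print(toolCall.function.arguments)
}
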
  2. A new option for the response_format parameter: developers can now supply a JSON Schema via json_schema. This is useful when the model is not calling a tool, but rather responding to the user in a structured way. This feature works with the newest GPT-4o models: gpt-4o-2024-08-06 and gpt-4o-mini-2024-07-18. When a response_format is supplied with strict: true, model outputs will match the supplied schema.

Using the previous schema, this is how you can implement it as a JSON schema using the convenient JSONSchemaResponseFormat object:

// 1: Define the Step schema object

let stepSchema = JSONSchema(
   type: .object,
   properties: [
      "explanation": JSONSchema(type: .string),
      "output": JSONSchema(
         type: .string)
   ],
   required: ["explanation", "output"],
   additionalProperties: false
)

// 2. Define the steps Array schema.

let stepsArraySchema = JSONSchema(type: .array, items: stepSchema)

// 3. Define the final Answer schema.

let finalAnswerSchema = JSONSchema(type: .string)

// 4. Define the response format JSON schema.

let responseFormatSchema = JSONSchemaResponseFormat(
   name: "math_response",
   strict: true,
   schema: JSONSchema(
      type: .object,
      properties: [
         "steps": stepsArraySchema,
         "final_answer": finalAnswerSchema
      ],
      required: ["steps", "final_answer"],
      additionalProperties: false
   )
)

let prompt = "solve 8x + 31 = 2"
let systemMessage = ChatCompletionParameters.Message(role: .system, content: .text("You are a math tutor"))
let userMessage = ChatCompletionParameters.Message(role: .user, content: .text(prompt))
let parameters = ChatCompletionParameters(
   messages: [systemMessage, userMessage],
   model: .gpt4o20240806,
   responseFormat: .jsonSchema(responseFormatSchema))
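
Because the reply now conforms to the schema, the message content can be decoded straight into a matching Codable type. A minimal sketch, assuming the JSON text is exposed as a string at choices.first?.message.content:

import Foundation

// Codable mirror of the math_response schema defined above.
struct MathResponse: Decodable {
   struct Step: Decodable {
      let explanation: String
      let output: String
   }
   let steps: [Step]
   let finalAnswer: String

   enum CodingKeys: String, CodingKey {
      case steps
      case finalAnswer = "final_answer"
   }
}

let chat = try await service.startChat(parameters: parameters)

// The property path is an assumption; check ChatCompletionObject for the exact shape.
if let json = chat.choices.first?.message.content,
   let data = json.data(using: .utf8) {
   let mathResponse = try JSONDecoder().decode(MathResponse.self, from: data)
   print(mathResponse.finalAnswer)
}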

SwiftOpenAI Structured Outputs supports:

  • Structured output via tools.
  • Structured output via response format.
  • Recursive schemas.
  • Optional-value schemas.

Pydantic models are not supported; users need to manually create schemas using the JSONSchema or JSONSchemaResponseFormat objects.

Pro tip 🔥 Use the iosAICodeAssistant GPT to construct SwiftOpenAI schemas. Just paste your JSON schema and ask the GPT to create SwiftOpenAI schemas for tools and response format.

For more details, visit the Demo project for tools and response format.

What's Changed Summary:

Full Changelog: v3.6.2...v3.4.0

SwiftOpenAI v3.6.2

03 Aug 07:15
e4b6405

What's Changed

Full Changelog: v3.6.1...v3.6.2

AIProxy bug fix

22 Jul 01:47
392d01e

If the developer does not provide an AIProxy serviceURL in their integration, we want to fall back to api.aiproxy.pro and not http://Lous-MacBook-Air-3.local:4000 😅
#64
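
For reference, a minimal sketch of an AIProxy-backed service that relies on the default fallback by omitting serviceURL; the factory parameter name is an assumption based on the library's AIProxy integration, so check OpenAIServiceFactory for the exact signature:

// Parameter name is an assumption; consult OpenAIServiceFactory for the exact API.
// With no serviceURL supplied, requests fall back to https://api.aiproxy.pro.
let service = OpenAIServiceFactory.service(
   aiproxyPartialKey: "your_partial_key")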