Merge pull request #49 from samchon/feature/v31-array-items-optional
Fix `OpenApiV3_1.IJsonSchema.IArray.items` to be optional.
samchon authored Sep 13, 2024
2 parents c3c995d + 11b504a commit 6b55172
Showing 4 changed files with 42 additions and 12 deletions.
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "@samchon/openapi",
- "version": "1.0.0",
+ "version": "1.0.1",
"description": "OpenAPI definitions and converters for 'typia' and 'nestia'.",
"main": "./lib/index.js",
"module": "./lib/index.mjs",
2 changes: 1 addition & 1 deletion src/OpenApiV3_1.ts
@@ -299,7 +299,7 @@ export namespace OpenApiV3_1 {
}

export interface IArray extends __ISignificant<"array"> {
- items: IJsonSchema | IJsonSchema[];
+ items?: IJsonSchema | IJsonSchema[];
prefixItems?: IJsonSchema[];
uniqueItems?: boolean;
additionalItems?: boolean | IJsonSchema;
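For illustration, here is a minimal, self-contained sketch of why this change was needed (the type names mirror `OpenApiV3_1.IJsonSchema.IArray`, but these definitions are local stand-ins, not the library's full ones): in OpenAPI v3.1, a tuple-style array schema may declare only `prefixItems`, so `items` has to be optional.

```typescript
// Local stand-ins for OpenApiV3_1.IJsonSchema / IArray (simplified,
// not the library's full definitions).
type IJsonSchema = { type: "string" } | { type: "number" } | IArray;

interface IArray {
  type: "array";
  items?: IJsonSchema | IJsonSchema[]; // optional as of this commit
  prefixItems?: IJsonSchema[];
  uniqueItems?: boolean;
  additionalItems?: boolean | IJsonSchema;
}

// A [string, number] tuple schema: valid OpenAPI v3.1, yet it has no
// `items` member -- it would not type-check while `items` was required.
const tuple: IArray = {
  type: "array",
  prefixItems: [{ type: "string" }, { type: "number" }],
};
```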
20 changes: 10 additions & 10 deletions src/structures/IHttpLlmApplication.ts
@@ -6,7 +6,7 @@ import { ILlmSchema } from "./ILlmSchema";
/**
* Application of LLM function call from OpenAPI document.
*
- * `IHttpLlmApplication` is a data structure representing collection of
+ * `IHttpLlmApplication` is a data structure representing a collection of
* {@link IHttpLlmFunction LLM function calling schemas} composed from the
* {@link OpenApi.IDocument OpenAPI document} and its {@link OpenApi.IOperation operation}
* metadata. It also contains {@link IHttpLlmApplication.errors failed operations}, and
@@ -47,9 +47,9 @@ import { ILlmSchema } from "./ILlmSchema";
* ```
*
* By the way, there can be some parameters (or their nested properties) which must be
- * composed by human, not by LLM. File uploading feature or some sensitive information
+ * composed by Human, not by LLM. File uploading feature or some sensitive information
* like secret key (password) are the examples. In that case, you can separate the
- * function parameters to both LLM and human sides by configuring the
+ * function parameters to both LLM and Human sides by configuring the
* {@link IHttpLlmApplication.IOptions.separate} property. The separated parameters are
* assigned to the {@link IHttpLlmFunction.separated} property.
*
@@ -60,7 +60,7 @@ import { ILlmSchema } from "./ILlmSchema";
* conversation based on the return value.
*
* Additionally, if you've configured {@link IHttpLlmApplication.IOptions.separate},
- * so that the parameters are separated to human and LLM sides, you can merge these
+ * so that the parameters are separated to Human and LLM sides, you can merge these
* human and LLM sides' parameters into one through {@link HttpLlm.mergeParameters}
* before the actual LLM function call execution.
*
@@ -182,27 +182,27 @@ export namespace IHttpLlmApplication {
* Separator function for the parameters.
*
* When composing parameter arguments through LLM function call,
- * there can be a case that some parameters must be composed by human,
+ * there can be a case that some parameters must be composed by Human,
* or LLM cannot understand the parameter. For example, if the
* parameter type has configured
* {@link ILlmSchema.IString.contentMediaType} which indicates file
- * uploading, it must be composed by human, not by LLM
+ * uploading, it must be composed by Human, not by LLM
* (Large Language Model).
*
* In that case, if you configure this property with a function that
- * predicating whether the schema value must be composed by human or
+ * predicating whether the schema value must be composed by Human or
* not, the parameters would be separated into two parts.
*
* - {@link IHttpLlmFunction.separated.llm}
- * - {@link IHttpLlmFunction.separated.human}
+ * - {@link IHttpLlmFunction.separated.Human}
*
* When writing the function, note that returning value `true` means
- * to be a human composing the value, and `false` means to LLM
+ * to be a Human composing the value, and `false` means to LLM
* composing the value. Also, when predicating the schema, it would
* be better to utilize the {@link LlmTypeChecker} features.
*
* @param schema Schema to be separated.
- * @returns Whether the schema value must be composed by human or not.
+ * @returns Whether the schema value must be composed by Human or not.
* @default null
*/
separate: null | ((schema: Schema) => boolean);
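As a sketch of such a separator predicate (the schema type below is a local, simplified stand-in for `ILlmSchema`, and the rule itself is only an assumed example, not the library's logic): treat string schemas declaring `contentMediaType` (file uploading) as Human-composed.

```typescript
// Local stand-in for ILlmSchema (assumed, simplified shape).
type LlmSchemaSketch =
  | { type: "string"; contentMediaType?: string }
  | { type: "number" }
  | { type: "boolean" };

// Example `separate` predicate: `true` means the Human must compose the
// value (e.g. file uploading), `false` means the LLM may compose it.
const separate = (schema: LlmSchemaSketch): boolean =>
  schema.type === "string" && schema.contentMediaType !== undefined;
```

In a real application, such a predicate would be assigned to `IHttpLlmApplication.IOptions.separate`, preferably built on the `LlmTypeChecker` helpers the doc comment recommends.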
30 changes: 30 additions & 0 deletions src/structures/ILlmApplication.ts
@@ -1,6 +1,36 @@
import { ILlmFunction } from "./ILlmFunction";
import { ILlmSchema } from "./ILlmSchema";

/**
* Application of LLM function calling.
*
* `ILlmApplication` is a data structure representing a collection of
* {@link ILlmFunction LLM function calling schemas}, composed from a native
* TypeScript class (or interface) type by the `typia.llm.application<App>()`
* function.
*
* By the way, the LLM function calling application composition, i.e., converting
* a TypeScript interface (or class) type into an `ILlmApplication` instance, is
* not always successful. As an LLM provider like OpenAI cannot understand the
* recursive reference type that is embodied by {@link OpenApi.IJsonSchema.IReference},
* the conversion fails when there are some recursive types in the TypeScript
* interface (or class) type.
*
* Also, there can be some parameters (or their nested properties) which must be
* composed by Human, not by LLM. File uploading feature or some sensitive information
* like secret key (password) are the examples. In that case, you can separate the
* function parameters to both LLM and human sides by configuring the
* {@link ILlmApplication.IOptions.separate} property. The separated parameters are
* assigned to the {@link ILlmFunction.separated} property.
*
* For reference, when both LLM and Human have filled their parameter values, you can
* merge them by calling the {@link HttpLlm.mergeParameters} function. In other words,
* if you've configured the {@link ILlmApplication.IOptions.separate} property, you
* have to merge the separated parameters before the function call execution.
*
* @reference https://platform.openai.com/docs/guides/function-calling
* @author Jeongho Nam - https://github.com/samchon
*/
export interface ILlmApplication<Schema extends ILlmSchema = ILlmSchema> {
/**
* List of function metadata.
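The merge step that the doc comment above describes can be sketched as follows (a hand-rolled illustration only; the library's actual `HttpLlm.mergeParameters` API may differ): Human-composed values fill the slots the LLM side left empty.

```typescript
// Illustrative merge of separated parameter arrays: wherever the LLM side
// left `null` (a separated slot), take the Human side's value instead.
const mergeParameters = (llm: unknown[], human: unknown[]): unknown[] =>
  llm.map((value, index) =>
    value === null && human[index] !== undefined ? human[index] : value,
  );

// The LLM composed the query; the Human supplied the secret password.
const merged = mergeParameters(["query", null], [undefined, "secret-password"]);
// merged: ["query", "secret-password"]
```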
