Application of LLM function call from OpenAPI document.

IHttpLlmApplication is a data structure representing a collection of LLM function calling schemas, composed from an OpenAPI document and its operation metadata. It also contains the operations that failed to convert, and the options adjusted during the IHttpLlmApplication construction.

The API operations are converted to the IHttpLlmFunction type, which represents an LLM function calling schema. If an operation contains a type that is not supported by the LLM, the conversion of that operation fails, and it is pushed into IHttpLlmApplication.errors. Otherwise, the operation is successfully converted to IHttpLlmFunction, and its type schemas are downgraded to OpenApiV3.IJsonSchema and then converted to ILlmSchemaV3.

For reference, the arguments type is composed by the following rule:

{
...pathParameters,
query,
body,
}
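
The composition rule above can be illustrated with a minimal sketch. Note that this is not the library's actual implementation; the input shape and function name below are assumptions for illustration only.

```typescript
// Illustrative sketch: how an LLM function's single arguments object is
// composed from an operation's path parameters, query object, and body.
interface IOperationInput {
  pathParameters: Record<string, string | number>;
  query?: Record<string, unknown>;
  body?: unknown;
}

function composeArguments(input: IOperationInput): Record<string, unknown> {
  return {
    ...input.pathParameters, // each path parameter becomes a top-level property
    ...(input.query !== undefined ? { query: input.query } : {}),
    ...(input.body !== undefined ? { body: input.body } : {}),
  };
}

// Example: an operation like PATCH /users/{id}?limit=10 with a JSON body
const args = composeArguments({
  pathParameters: { id: "robot" },
  query: { limit: 10 },
  body: { nickname: "Samchon" },
});
// args == { id: "robot", query: { limit: 10 }, body: { nickname: "Samchon" } }
```

Because path parameters are spread at the top level while query and body keep their own keys, the LLM fills a single flat-ish object per function call.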

Note that some parameters (or their nested properties) must be composed by a human, not by the LLM. File uploading, or sensitive information like a secret key (password), are typical examples. In that case, you can separate the function parameters into LLM and human sides by configuring the IHttpLlmApplication.IOptions.separate property. The separated parameters are assigned to the IHttpLlmFunction.separated property.
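
The idea of separation can be sketched as below. The predicate and result shapes are assumptions for illustration, not the library's actual types; the real separation works on schemas, per nested property.

```typescript
// Minimal sketch: split function parameters into an LLM-side group and a
// human-side group, per top-level property. A predicate decides whether a
// human must fill a property (e.g. file uploads, passwords).
type Separator = (key: string) => boolean;

interface ISeparated {
  llm: Record<string, unknown>;   // properties the LLM may fill
  human: Record<string, unknown>; // properties a human must fill
}

function separateParameters(
  params: Record<string, unknown>,
  isHumanSide: Separator,
): ISeparated {
  const result: ISeparated = { llm: {}, human: {} };
  for (const [key, value] of Object.entries(params))
    (isHumanSide(key) ? result.human : result.llm)[key] = value;
  return result;
}

// Example: "password" must be filled by a human, the rest by the LLM.
const separated = separateParameters(
  { email: "someone@example.com", password: null, nickname: "robot" },
  (key) => key === "password",
);
// separated.llm   == { email: "someone@example.com", nickname: "robot" }
// separated.human == { password: null }
```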

For reference, the actual function call is executed not by the LLM, but by you. When the LLM selects the proper function and fills its arguments, you just call the function through HttpLlm.execute with the LLM-prepared arguments, and then inform the LLM of the return value through a system prompt. The LLM will continue the conversation based on that return value.
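
The overall flow can be sketched as below. Here `callLlm` and `executeHttpOperation` are hypothetical stand-ins for your LLM vendor's API and for the HTTP execution step (HttpLlm.execute in the real library), respectively.

```typescript
// Sketch of the function calling loop. The LLM only SELECTS a function and
// FILLS its arguments; the actual HTTP request is executed by your code.
interface IFunctionCall {
  name: string;                       // function selected by the LLM
  arguments: Record<string, unknown>; // arguments composed by the LLM
}

// Hypothetical stand-in for the LLM vendor's chat completion API.
function callLlm(prompt: string): IFunctionCall {
  return {
    name: "updateUser",
    arguments: { id: "robot", body: { nickname: "Samchon" } },
  };
}

// Hypothetical stand-in for the HTTP execution step.
function executeHttpOperation(call: IFunctionCall): unknown {
  return { id: call.arguments.id, updated: true };
}

const call = callLlm("Rename user robot to Samchon");
const output = executeHttpOperation(call);
// Feed the return value back to the LLM so it can continue the conversation.
const followUp = `Function ${call.name} returned: ${JSON.stringify(output)}`;
```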

Additionally, if you've configured IHttpLlmApplication.IOptions.separate, so that the parameters are separated into human and LLM sides, you can merge both sides' parameters into one through HttpLlm.mergeParameters before the actual function call execution.
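
Such merging can be sketched as a recursive combination of the two sides. This is a simplified illustration, not the real HttpLlm.mergeParameters, which merges according to the separated schema rather than performing a generic deep merge.

```typescript
// Simplified sketch: merge human-filled and LLM-filled parameter objects
// back into one arguments object before execution. Human-side values win
// on conflicts, and nested plain objects are merged recursively.
function isPlainObject(value: unknown): value is Record<string, unknown> {
  return typeof value === "object" && value !== null && !Array.isArray(value);
}

function mergeParameters(
  llm: Record<string, unknown>,
  human: Record<string, unknown>,
): Record<string, unknown> {
  const merged: Record<string, unknown> = { ...llm };
  for (const [key, value] of Object.entries(human))
    merged[key] =
      isPlainObject(value) && isPlainObject(merged[key])
        ? mergeParameters(merged[key] as Record<string, unknown>, value)
        : value;
  return merged;
}

const mergedArgs = mergeParameters(
  { id: "robot", body: { nickname: "Samchon" } },         // LLM side
  { body: { password: "secret" } },                       // human side
);
// mergedArgs == { id: "robot", body: { nickname: "Samchon", password: "secret" } }
```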

Jeongho Nam - https://github.com/samchon

interface IHttpLlmApplication<Model extends ILlmSchema.Model> {
    errors: IHttpLlmApplication.IError[];
    functions: IHttpLlmFunction<Model>[];
    model: Model;
    options: IHttpLlmApplication.IOptions<Model>;
}

Type Parameters

Model: Schema model of the target LLM.

Properties

errors: IHttpLlmApplication.IError[]

List of errors that occurred during the composition.

functions: IHttpLlmFunction<Model>[]

List of function metadata that can be used for the LLM function call.

When you want to execute a function with the LLM-constructed arguments, you can do it through the HttpLlm.execute function.

model: Model

Model of the target LLM.

options: IHttpLlmApplication.IOptions<Model>

Configuration for the application.