đź“– Guide Documents
Generators
OpenAI Function Call

Outline

Terminal
# BUILD OPENAI DOCUMENT ONLY
npx nestia openai
 
# BUILD OPENAI/SWAGGER/SDK/E2E AT THE SAME TIME 
npx nestia all

Configure the nestia.config.ts file and run the npx nestia openai command.

Then, @nestia/sdk will analyze your NestJS backend server code and generate an openai.json file. The openai.json file contains OpenAI function calling schemas, so that you can develop an OpenAI function calling agent service with it.

For reference, the OpenAI function calling schemas generated by @nestia/sdk follow the type definitions of the @wrtnio/openai-function-schema package. As that package provides a function call executor and a schema separator that splits parameters into LLM (Large Language Model) and human parts, you can easily develop an OpenAI function calling agent service with @nestia/sdk and @wrtnio/openai-function-schema.

Configuration

Application Module

nestia.config.ts
import { INestiaConfig } from "@nestia/sdk";
import { NestFactory } from "@nestjs/core";
// import { FastifyAdapter } from "@nestjs/platform-fastify";
 
import { YourModule } from "./src/YourModule";
 
const NESTIA_CONFIG: INestiaConfig = {
  input: async () => {
    const app = await NestFactory.create(YourModule);
    // const app = await NestFactory.create(YourModule, new FastifyAdapter());
    // app.setGlobalPrefix("api");
    // app.enableVersioning({
    //     type: VersioningType.URI,
    //     prefix: "v",
    // })
    return app;
  },
  openai: {
    output: "dist/openai.json",
    beautify: true,
    keyword: true,
  },
};
export default NESTIA_CONFIG;

Make the nestia.config.ts file and run the npx nestia openai command.

First, create the nestia.config.ts file through the npx nestia init command. It should be placed in the top-level directory of your NestJS backend project; for reference, the tsconfig.json file must also be placed in the top-level directory. After creation, configure the nestia.config.ts file by referencing the example code above and the type definitions below.

At minimum, you have to configure these two properties:

  • input: Accessor of controller classes
  • openai.output: Path of openai.json file

When you've completed the configuration above, just run the npx nestia openai command. Then, the openai.json file will be generated and placed into the $config.openai.output location following your nestia.config.ts configuration.
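
For reference, after generation you can load and validate the document in a script like below. This is a minimal sketch, assuming the dist/openai.json output path of the example configuration above; the file name validate-openai-document.ts is hypothetical.

validate-openai-document.ts
import { IOpenAiDocument } from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
 
// load the generated file and validate its structure at runtime
const document: IOpenAiDocument = typia.assert<IOpenAiDocument>(
  JSON.parse(fs.readFileSync("dist/openai.json", "utf8")),
);
 
// list every function call schema by HTTP method and path
for (const func of document.functions) console.log(func.method, func.path);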

Multiple Files Generation

nestia.config.ts
import { INestiaConfig } from "@nestia/sdk";
import { NestFactory } from "@nestjs/core";
 
import { AppModule } from "./src/modules/AppModule";
import { BbsModule } from "./src/modules/BbsModule";
import { CommonModule } from "./src/modules/CommonModule";
 
export const NESTIA_CONFIGURATIONS: INestiaConfig[] = [
  {
    input: () => NestFactory.create(AppModule),
    openai: {
      output: "openai.json",
      keyword: true,
    },
  },
  {
    input: () => NestFactory.create(BbsModule),
    openai: {
      output: "bbs.openai.json",
      keyword: true,
    },
  },
  {
    input: () => NestFactory.create(CommonModule),
    openai: {
      output: "common.openai.json",
      keyword: false,
    },
  },
];
export default NESTIA_CONFIGURATIONS;

You can build multiple OpenAI function calling schema files.

Just configure an array of INestiaConfig instances like the example code above.

Then, @nestia/sdk will generate multiple OpenAI function calling schema files following each configuration.

Additional Properties

  • openai.keyword: Whether the parameters are keyworded or not.
  • openai.separate: Separator function splitting the parameters into human and LLM parts.
  • openai.beautify: Whether to beautify the JSON content or not.

See the detailed options below:

INestiaConfig.ts
export namespace INestiaConfig {
  /**
   * Configuration for the OpenAI function call schema generation.
   */
  export interface IOpenAiConfig {
    /**
     * Output path of the `openai.json`.
     *
     * If you've configured only a directory, the file name will be `openai.json`.
     * Otherwise, if you've configured the full path with file name and extension,
     * the `openai.json` file will be renamed to it.
     */
    output: string;
 
    /**
     * Whether the parameters are keyworded or not.
     *
     * If this property value is `true`, the length of the
     * {@link IOpenAiDocument.IFunction.parameters} is always 1, and the type
     * of the parameter is always the {@link IOpenAiSchema.IObject} type. Also,
     * its properties follow the rules below:
     *
     * - `pathParameters`: Path parameters of {@link ISwaggerMigrateRoute.parameters}
     * - `query`: Query parameter of {@link ISwaggerMigrateRoute.query}
     * - `body`: Body parameter of {@link ISwaggerMigrateRoute.body}
     *
     * ```typescript
     * {
     *   ...pathParameters,
     *   query,
     *   body,
     * }
     * ```
     *
     * Otherwise (this property value is `false`), the length of the
     * {@link IOpenAiDocument.IFunction.parameters} is variable, and the
     * sequence of the parameters follows the rules below.
     *
     * ```typescript
     * [
     *   ...pathParameters,
     *   ...(query ? [query] : []),
     *   ...(body ? [body] : []),
     * ]
     * ```
     *
     * @default false
     */
    keyword?: boolean;
 
    /**
     * Separator function for the parameters.
     *
     * When composing parameter arguments through an OpenAI function call,
     * there can be cases where some parameters must be composed by a human,
     * or where the LLM cannot understand the parameter. For example, if a
     * parameter type has configured {@link IOpenAiSchema.IString["x-wrtn-secret-key"]},
     * the secret key value must be composed by a human, not by the LLM
     * (Large Language Model).
     *
     * In that case, if you configure this property with a function that
     * predicates whether the schema value must be composed by a human or not,
     * the parameters would be separated into two parts.
     *
     * - {@link IOpenAiFunction.separated.llm}
     * - {@link IOpenAiFunction.separated.human}
     *
     * When writing the function, note that returning `true` means the value
     * must be composed by a human, and `false` means it must be composed by
     * the LLM. Also, when predicating the schema, it would be better to
     * utilize the {@link OpenAiTypeChecker} features.
     *
     * @param schema Schema to be separated.
     * @returns Whether the schema value must be composed by human or not.
     * @default null
     */
    separate?: null | ((schema: IOpenAiSchema) => boolean);
 
    /**
     * Whether to beautify JSON content or not.
     *
     * If you configure this property to be `true`, the `openai.json` file would
     * be beautified with indentation (2 spaces) and line breaks. If you
     * configure a numeric value instead, the indentation is specified by that
     * number.
     *
     * @default false
     */
    beautify?: boolean | number;
  }
}
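
For example, to make every string schema that holds a secret key or an uploaded file be composed by a human, you can configure the separate predicate like below. This is a minimal sketch: the x-wrtn-secret-key and contentMediaType checks assume that your API actually annotates such values with those attributes.

nestia.config.ts
import { INestiaConfig } from "@nestia/sdk";
import { NestFactory } from "@nestjs/core";
import {
  IOpenAiSchema,
  OpenAiTypeChecker,
} from "@wrtnio/openai-function-schema";
 
import { YourModule } from "./src/YourModule";
 
const NESTIA_CONFIG: INestiaConfig = {
  input: () => NestFactory.create(YourModule),
  openai: {
    output: "dist/openai.json",
    keyword: true,
    // returning `true` means the value must be composed by a human
    separate: (schema: IOpenAiSchema): boolean =>
      OpenAiTypeChecker.isString(schema) &&
      (schema["x-wrtn-secret-key"] !== undefined ||
        schema.contentMediaType !== undefined),
  },
};
export default NESTIA_CONFIG;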

CLI Arguments

Terminal
npx nestia openai
npx nestia openai --config nestia2.config.ts
npx nestia openai --project tsconfig2.json
npx nestia openai --config nestia3.config.ts --project tsconfig3.json

If you have a configuration file whose name is not nestia.config.ts, or one that is not placed in the root directory of the project, you can specify it with the --config option, like npx nestia openai --config another.config.ts.

Also, if you have a special tsconfig.json file, or the project file is not located in the root directory of the project, you can specify it with the --project argument, like npx nestia openai --project another.tsconfig.json.

Function Call Execution

Positional

test_fetcher_positional_bbs_article_update.ts
import {
  IOpenAiDocument,
  IOpenAiFunction,
  OpenAiFetcher,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";
 
import { IBbsArticle } from "../../../api/structures/IBbsArticle";
 
const main = async (): Promise<void> => {
  // OPENAI FUNCTION CALL SCHEMAS
  const document: IOpenAiDocument = JSON.parse( 
    await fs.promises.readFile("openai.json", "utf8"),
  );
 
  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "put" && f.path === "/bbs/articles",
  )!;
  const article: IBbsArticle = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: [
      // imagine that arguments are composed by OpenAI
      v4(),
      typia.random<IBbsArticle.ICreate>(),
    ],
  });
  typia.assert(article);
};
main().catch(console.error);

If you deliver the openai.json file's function call schemas to the OpenAI SDK, the OpenAI chat agent will choose the proper functions during the conversation and compose argument values automatically to induce function call executions.

When the OpenAI SDK notifies you of the target function to execute with its argument values, just perform the function call execution by importing the OpenAiFetcher module of @wrtnio/openai-function-schema and calling its OpenAiFetcher.execute() method.

Above is the example code utilizing the OpenAiFetcher.execute() method with positional parameters, and below is a sketch of wiring the generated schemas into the OpenAI SDK.
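
The sketch below shows one way to deliver the generated schemas to the OpenAI SDK and route the resulting tool call back into OpenAiFetcher.execute(). It is a minimal sketch, not the definitive integration: the naming() rule is a hypothetical workaround for OpenAI tool names not allowing "/" characters, the model name and prompt are placeholders, passing f.parameters[0] assumes the keyword: true mode, and f.description assumes each function carries a description.

openai_function_call_agent.ts
import {
  IOpenAiDocument,
  IOpenAiFunction,
  OpenAiFetcher,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import OpenAI from "openai";
 
const main = async (): Promise<void> => {
  // OPENAI FUNCTION CALL SCHEMAS
  const document: IOpenAiDocument = JSON.parse(
    await fs.promises.readFile("openai.json", "utf8"),
  );
 
  // hypothetical naming rule: OpenAI tool names cannot contain "/"
  const naming = (f: IOpenAiFunction): string =>
    `${f.method}${f.path.split("/").join("_")}`;
 
  // DELIVER EVERY FUNCTION CALL SCHEMA TO THE OPENAI SDK
  const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
  const completion = await client.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Update the article, please." }],
    tools: document.functions.map((f) => ({
      type: "function" as const,
      function: {
        name: naming(f),
        description: f.description,
        parameters: f.parameters[0] as Record<string, any>,
      },
    })),
  });
 
  // WHEN THE MODEL CHOOSES A FUNCTION, EXECUTE IT THROUGH THE FETCHER
  const call = completion.choices[0].message.tool_calls?.[0];
  if (call) {
    const func: IOpenAiFunction = document.functions.find(
      (f) => naming(f) === call.function.name,
    )!;
    await OpenAiFetcher.execute({
      document,
      function: func,
      connection: { host: "http://localhost:3000" },
      arguments: [JSON.parse(call.function.arguments)],
    });
  }
};
main().catch(console.error);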

Keyword

test_fetcher_keyword_bbs_article_update.ts
import {
  IOpenAiDocument,
  IOpenAiFunction,
  OpenAiFetcher,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";
 
import { IBbsArticle } from "../../../api/structures/IBbsArticle";
 
const main = async (): Promise<void> => {
  // OPENAI FUNCTION CALL SCHEMAS
  const document: IOpenAiDocument = JSON.parse( 
    await fs.promises.readFile("openai.json", "utf8"),
  );
 
  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "put" && f.path === "/bbs/articles",
  )!;
  const article: IBbsArticle = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: [
      // imagine that argument is composed by OpenAI
      {
        id: v4(),
        body: typia.random<IBbsArticle.ICreate>(),
      },
    ],
  });
  typia.assert(article);
};
main().catch(console.error);

If you've configured the openai.keyword option of the nestia.config.ts file to be true, every OpenAI function call schema defined in openai.json has only one parameter, and that parameter is always of the object type.

This is the concept of the "keyworded parameter": keeping only one parameter by grouping every parameter into an object. This "keyworded parameter" mode is recommended, because OpenAI composes more suitable argument values with keyworded parameters than with non-keyworded (positional) parameters.

Also, do not worry about reconstruction of the parameters. The OpenAiFetcher.execute() method of @wrtnio/openai-function-schema will automatically decompose the object parameter into positional parameters and perform the function call execution.

Just put in the argument value composed by OpenAI, and OpenAiFetcher.execute() will do everything; the snippet below contrasts the two argument shapes.
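
For the same PUT /bbs/articles operation, the two modes expect argument arrays shaped as follows; this is an illustration distilled from the two examples above.

import typia from "typia";
import { v4 } from "uuid";
 
import { IBbsArticle } from "../../../api/structures/IBbsArticle";
 
// keyword: true — a single object grouping path parameters and body
const keyworded = [{ id: v4(), body: typia.random<IBbsArticle.ICreate>() }];
 
// keyword: false — positional parameters in path/query/body order
const positional = [v4(), typia.random<IBbsArticle.ICreate>()];
 
console.log(keyworded, positional);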

Separate

test_combiner_keyword_parameters_query.ts
import {
  IOpenAiDocument,
  IOpenAiFunction,
  IOpenAiSchema,
  OpenAiDataCombiner,
  OpenAiFetcher,
  OpenAiTypeChecker,
} from "@wrtnio/openai-function-schema";
import fs from "fs";
import typia from "typia";
 
import { IMembership } from "../../api/structures/IMembership";
 
const main = async (): Promise<void> => {
  // OPENAI FUNCTION CALL SCHEMAS
  const document: IOpenAiDocument = JSON.parse(
    await fs.promises.readFile("openai.json", "utf8"),
  );
 
  // EXECUTE OPENAI FUNCTION CALL
  const func: IOpenAiFunction = document.functions.find(
    (f) => f.method === "patch" && f.path === "/membership/change",
  )!;
  const membership: IMembership = await OpenAiFetcher.execute({
    document,
    function: func,
    connection: { host: "http://localhost:3000" },
    arguments: OpenAiDataCombiner.parameters({
      function: func,
      llm: [
        // imagine that below argument is composed by OpenAI
        {
          body: {
            name: "Wrtn Technologies",
            email: "master@wrtn.io",
            password: "1234",
            age: 20,
            gender: 1,
          },
        },
      ],
      human: [
        // imagine that below argument is composed by human
        {
          query: {
            secret: "something",
          },
          body: {
            secretKey: "something",
            picture: "https://wrtn.io/logo.png",
          },
        },
      ],
    }),
  });
  typia.assert(membership);
};
main().catch(console.error);

At last, there can be some special API operations in which some arguments must be composed by the user, not by the LLM (Large Language Model). For example, if an API operation requires a file upload or a secret key identifier, the value must be composed manually by the user on the frontend application side.

For such cases, the nestia.config.ts file supports the special option openai.separate. If you configure the callback function, it is utilized to determine whether each value must be composed by the user or not. When the arguments are composed by both the user and LLM sides, you can combine them into one through the OpenAiDataCombiner.parameters() method, so that you can still execute the function call with the OpenAiFetcher.execute() method.

Above is the example code for such a special case.
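
For reference, functions generated with a separate predicate carry the split schemas in their separated property (the IOpenAiFunction.separated.llm and IOpenAiFunction.separated.human parts referenced in the type definitions above). Below is a minimal sketch, continuing from the example above, of checking which parameters the frontend must fill in.

// a minimal sketch: inspect which schemas must be composed by a human
if (func.separated) {
  console.log("LLM-side schemas:", func.separated.llm);
  console.log("human-side schemas:", func.separated.human);
}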