@motorro/firebase-ai-chat-openai
Firebase OpenAI chat library

OpenAI chat library. See the top-level documentation for the complete reference.

OpenAI setup

We also need to set up the OpenAI API. To do this, obtain an OpenAI API key and define the ID of the assistant to use:

import {defineSecret, defineString} from "firebase-functions/params";
import {CallableOptions} from "firebase-functions/v2/https";

const region = "europe-west1";
const openAiApiKey = defineSecret("OPENAI_API_KEY");
const openAiAssistantId = defineString("OPENAI_ASSISTANT_ID");

const options: CallableOptions = {
  secrets: [openAiApiKey],
  region: region,
  invoker: "public"
};

Optional custom message mapper

By default, the library passes text messages through as-is. If you need custom message processing, such as handling images or attaching metadata, you can create your own AI message mapper and supply it to the chat factory's worker method. The default message mapper can be found in the library sources.

const myMessageMapper: OpenAiMessageMapper = {
    toAi(message: NewMessage): UserMessageParts {
        throw new Error("TODO: Implement mapping from Chat to OpenAI");
    },

    fromAi(message: Message): NewMessage | undefined {
        throw new Error("TODO: Implement OpenAI to Chat message mapping");
    }
};
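For instance, a minimal pass-through mapper could look like the sketch below. Note that `NewMessage` and `UserMessageParts` are simplified local stand-ins here, for illustration only; consult the library's actual type definitions for the real shapes:

```typescript
// Simplified stand-ins for the library types, for illustration only
type NewMessage = string | {text: string; meta?: Record<string, unknown>};
type UserMessageParts = Array<{type: "text"; text: string}>;

const passThroughMapper = {
    // Chat → OpenAI: wrap plain text into OpenAI content parts
    toAi(message: NewMessage): UserMessageParts {
        const text = typeof message === "string" ? message : message.text;
        return [{type: "text", text: text}];
    },

    // OpenAI → Chat: keep only the text content, drop everything else
    fromAi(parts: UserMessageParts): NewMessage | undefined {
        const text = parts.map((p) => p.text).join("\n");
        return text.length > 0 ? text : undefined;
    }
};
```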

Custom resource cleaner

When a chat is closed, the library cleans up the threads that were created during the conversation. If you need extra processing, you can register a custom cleaner that will be called alongside the built-in cleanup:

/**
 * Chat resource cleaner
 */
const cleaner: ChatCleaner = {
    /**
     * Schedules cleanup commands stored inside chat data
     * @param chatDocumentPath Chat document
     */
    cleanup: async (chatDocumentPath: string): Promise<void> => {
        logger.d("Cleanup");
    }
}
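For example, a cleaner that removes auxiliary records your app keeps per chat might look like the following sketch. The `ChatCleaner` interface is reduced here to its single method, and `sideStore` is a hypothetical stand-in for whatever side storage your application maintains:

```typescript
// Reduced local stand-in for the library's ChatCleaner interface
type ChatCleaner = {cleanup: (chatDocumentPath: string) => Promise<void>};

// Hypothetical side storage keyed by chat document path
const sideStore = new Map<string, unknown>();

const customCleaner: ChatCleaner = {
    cleanup: async (chatDocumentPath: string): Promise<void> => {
        // Remove everything this chat created outside the OpenAI threads
        sideStore.delete(chatDocumentPath);
    }
};
```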

Optional middleware

By default, the library saves all messages that come from the AI as-is. If you need custom processing, you can add AI message middleware. Take a look at the main documentation for details.

const handOver: MessageMiddleware<CalculateChatData, CalculatorMeta> = chatFactory.handOverMiddleware(
    "calculator",
    handOverProcessor
);
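As an illustration, a middleware that redacts e-mail addresses before messages are saved could be sketched as follows. The signature below is a simplified local stand-in; the real `MessageMiddleware` type comes from the library:

```typescript
// Simplified stand-in for the library's middleware signature
type NextSaver = (messages: string[]) => Promise<void>;
type Middleware = (messages: string[], next: NextSaver) => Promise<void>;

// Redact e-mail addresses, then pass the messages on to the next handler
const redactEmails: Middleware = async (messages, next) => {
    const redacted = messages.map((m) =>
        m.replace(/[\w.+-]+@[\w-]+\.[\w.]+/g, "[redacted]")
    );
    await next(redacted);
};
```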

Refer to the Configure your environment article for more information on setting environment and secret variables for your functions.

Command dispatcher configuration

Requests to the front-facing functions return to the user as fast as possible after changing the chat state in Firestore. Since an AI run may take considerable time, we run it "offline" in a Cloud Task, decoupled from the client request. To execute the Assistant run we use the second class from the library, the OpenAiChatWorker. To create it, use the AiChat factory we created as described in the main documentation.

To register the Cloud Task handler, create a function like the following:

import {onTaskDispatched} from "firebase-functions/v2/tasks";
import OpenAI from "openai";
import {OpenAiWrapper, Meta} from "@motorro/firebase-ai-chat-openai";

export const calculator = onTaskDispatched(
    {
      secrets: [openAiApiKey],
      retryConfig: {
        maxAttempts: 1,
        minBackoffSeconds: 30
      },
      rateLimits: {
        maxConcurrentDispatches: 6
      },
      region: region
    },
    async (req) => {
      // Create and run a worker
      // See the `dispatchers` definitions below
      await chatFactory.worker(
          new OpenAI({apiKey: openAiApiKey.value()}), 
          dispatchers, 
          myMessageMapper, 
          cleaner, 
          [handOver]
      ).dispatch(
          req,
          (chatDocumentPath: string, meta: Meta) => {
             // Optional task completion handler
             // Meta - some meta-data passed to chat operation
          }   
      );
    }
);

The OpenAiChatWorker handles the OpenAiChatCommand and updates OpenAiChatState with the results.

A full example is available in the sample Firebase project.

Install

npm i @motorro/firebase-ai-chat-openai

Version

8.6.5

License

MIT

Collaborators

  • motorro