react-native-gen-ui


React Native Generative UI Library

Inspired by Vercel's Generative UI for React Server Components.

Offers seamless integration of OpenAI's advanced AI capabilities within React Native applications. The library provides components and helpers for building AI-powered streaming text and chat UIs.

[Example GIF]

If you are interested in learning more, take a look at this blog post. It discusses how to build UIs with this package and what happens behind the scenes.

Features

  • React Native (with Expo) type-safe helpers for streaming text responses, plus components for building chat UIs
  • First-class support for function calling, including components the LLM decides to render for interactive user interfaces
  • Easy UI implementation with the powerful useChat hook
  • Support for OpenAI models
  • Streaming responses (only streaming is supported at the moment)
  • Supports OpenAI's Chat Completions API

Installation 🚀

It's easy to get started - just install the package with your favorite package manager:

Yarn

yarn add react-native-gen-ui

NPM

npm install react-native-gen-ui

Basic usage 🎉

Import

To get started, import the useChat hook in any React component:

import { OpenAI, isReactElement, useChat } from 'react-native-gen-ui';

Initialize the OpenAI instance

const openAi = new OpenAI({
  apiKey: process.env.EXPO_PUBLIC_OPENAI_API_KEY!,
  model: 'gpt-4',
  // You can even set a custom basePath of your SSE server
});

Ensure the OpenAI API key and, optionally, the desired model are configured as environment variables in your project (in Expo):

EXPO_PUBLIC_OPENAI_API_KEY=sk....           # Required, you can get one in the OpenAI dashboard
EXPO_PUBLIC_OPENAI_MODEL=model_name_here    # Optional, model name from OpenAI (defaults to 'gpt-4')

🚨 Note: this kind of implementation, where you access OpenAI directly from the client device, exposes your OpenAI API key to the public. The documentation here is just an example; for production use, make sure to point basePath to your own proxy server that forwards server-sent events from OpenAI back to the client.
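
As a rough illustration, pointing the client at such a proxy could reuse the basePath option shown above. The URL below is a placeholder, and whether the client still needs a key depends on how your proxy authenticates:

import { OpenAI } from 'react-native-gen-ui';

const openAi = new OpenAI({
  // Depending on your proxy, the real key can live server-side instead
  apiKey: process.env.EXPO_PUBLIC_OPENAI_API_KEY!,
  model: 'gpt-4',
  // Placeholder URL for your own server that forwards OpenAI's
  // server-sent events back to the app
  basePath: 'https://your-proxy.example.com/v1',
});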

Use the hook

Initialize the useChat hook inside your component. You can optionally pass initial messages, success and error handlers, and any tools the model will have access to.

const { input, onInputChange, messages, isLoading, handleSubmit } = useChat({
  openAi,
  // Optional initial messages
  initialMessages: [
    { content: 'Hi! How can I help you today?', role: 'system' },
  ],
  // Optional success handler
  onSuccess: (messages) => console.log('Chat success:', messages),
  // Optional error handler
  onError: (error) => console.error('Chat error:', error),
});

Create the UI for your chat interface: a text input, a submit button, and a view that displays the chat messages.

return (
  <View>
    {messages.map((msg, index) => {
      // A message can be a React component or a string (see the function calling section for more details)
      if (isReactElement(msg)) {
        return msg;
      }
      switch (msg.role) {
        case 'user':
          return (
            <Text
              style={{
                color: 'blue',
              }}
              key={index}>
              {msg.content?.toString()}
            </Text>
          );
        case 'assistant':
          return <Text key={index}>{msg.content?.toString()}</Text>;
        default:
          // This includes tool calls, tool results and system messages
          // Those are visible to the model, but we hide them from the user here
          return null;
      }
    })}
    <TextInput value={input} onChangeText={onInputChange} />
    <Button
      onPress={() => handleSubmit(input)}
      title="Send"
      disabled={isLoading}
    />
  </View>
);

Ensure you pass the input state to the TextInput component, onInputChange to handle text changes, and handleSubmit for sending messages.

Congrats 🎉 you have successfully implemented a chat using an OpenAI model!
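
Putting the pieces above together, a minimal self-contained sketch of a chat screen might look like this (the ChatScreen name and styling choices are illustrative):

import React from 'react';
import { Button, Text, TextInput, View } from 'react-native';
import { OpenAI, isReactElement, useChat } from 'react-native-gen-ui';

const openAi = new OpenAI({
  apiKey: process.env.EXPO_PUBLIC_OPENAI_API_KEY!,
  model: process.env.EXPO_PUBLIC_OPENAI_MODEL || 'gpt-4',
});

export default function ChatScreen() {
  const { input, onInputChange, messages, isLoading, handleSubmit } = useChat({
    openAi,
    initialMessages: [
      { content: 'Hi! How can I help you today?', role: 'system' },
    ],
  });

  return (
    <View>
      {messages.map((msg, index) => {
        // Tool components are rendered as-is
        if (isReactElement(msg)) return msg;
        // Show user and assistant messages, hide tool and system messages
        if (msg.role === 'user' || msg.role === 'assistant') {
          return <Text key={index}>{msg.content?.toString()}</Text>;
        }
        return null;
      })}
      <TextInput value={input} onChangeText={onInputChange} />
      <Button
        onPress={() => handleSubmit(input)}
        title="Send"
        disabled={isLoading}
      />
    </View>
  );
}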

Function calling (Tools) 🔧

The useChat hook supports the integration of Tools, a powerful feature allowing you to incorporate custom functions or external API calls directly into your chat flow.

Defining a Tool

Tools are defined via the tools parameter when initializing the useChat hook. Their parameters are validated using a zod schema. Below is an example of a weather forecast defined as a tool:

import { z } from 'zod';

const { input, messages, isLoading, handleSubmit, onInputChange } = useChat({
  ...
  tools: {
    getWeather: {
      description: "Get weather for a location",
      // Validate tool parameters using zod
      parameters: z.object({
        // In this case, the tool accepts a date and a location for the weather
        date: z.date().default(() => new Date()),
        location: z.string(),
      }),
      // Render component for weather - can yield loading states
      render: async function* (args) {
        // With 'yield' we can show a loading indicator while fetching weather data
        yield <Spinner />;

        // Call API for current weather
        const weatherData = await fetchWeatherData(args.location);

        // We can yield again to replace the loading component at any time.
        // This can be useful for showing progress or intermediate states.
        yield <Loading />;

        // Return the final result
        return {
          // The data will be seen by the model
          data: weatherData,
          // The component will be rendered to the user
          component: (
            <Weather
              location={args.location}
              current={weatherData[0]}
              forecast={weatherData}
            />
          ),
        };
      },
    },
  },
});
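
Note that Spinner, Loading, Weather and fetchWeatherData in the example above are not part of the library; they stand in for your own components and API calls. A rough sketch of what they could look like (the weather endpoint and data shape are made up):

import React from 'react';
import { ActivityIndicator, Text, View } from 'react-native';

// Illustrative placeholders -- none of these ship with react-native-gen-ui
const Spinner = () => <ActivityIndicator />;
const Loading = () => <Text>Fetching the forecast...</Text>;

type Forecast = { day: string; tempC: number };

// Hypothetical API call; swap in whichever weather service you use
async function fetchWeatherData(location: string): Promise<Forecast[]> {
  const res = await fetch(
    `https://api.example.com/weather?location=${encodeURIComponent(location)}`,
  );
  return (await res.json()) as Forecast[];
}

const Weather = ({
  location,
  current,
  forecast,
}: {
  location: string;
  current: Forecast;
  forecast: Forecast[];
}) => (
  <View>
    <Text>
      {location}: {current.tempC}°C today
    </Text>
    {forecast.map((f) => (
      <Text key={f.day}>
        {f.day}: {f.tempC}°C
      </Text>
    ))}
  </View>
);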

If a tool doesn't need to perform any async operations, because you expect the AI model to return all the data required to render the component, you can use a simple function that returns the data and the component:

tools: {
  joke: {
    description:
      "Call this tool with an original joke setup and punchline.",
    parameters: z.object({
      setup: z.string(),
      punchline: z.string(),
    }),
    // Render component for joke and pass data also back to the model.
    render: (data) => ({
      data,
      component: <JokeCard data={data} />,
    }),
  },
},
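
As with the weather example, JokeCard is your own component. One possible sketch, assuming the validated setup and punchline parameters are passed through unchanged:

import React from 'react';
import { Text, View } from 'react-native';

// Illustrative component -- not part of the library
const JokeCard = ({ data }: { data: { setup: string; punchline: string } }) => (
  <View>
    <Text style={{ fontWeight: 'bold' }}>{data.setup}</Text>
    <Text>{data.punchline}</Text>
  </View>
);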

The tools framework within useChat is highly extensible. You can define multiple tools to perform various functions based on your chat application's requirements.

Reference

const {
  input, // State of user input (i.e. in TextInput component)
  messages, // List of all messages for current chat session
  error, // Error that can occur during streaming
  isLoading, // Loading state - true immediately after user message submission
  isStreaming, // Streaming state - true while streaming response
  onInputChange, // Updates internal state of user input
  handleSubmit, // Handles user message submission
} = useChat({
  openAi: OpenAI, // OpenAI instance (imported from 'react-native-gen-ui')
  initialMessages: [], // Initial chat messages
  onSuccess: () => {...}, // Called when streaming response is completed
  onError: (error) => {...}, // Called when an error occurs while streaming
  tools: ... // Tools for custom API calls or functions
});
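
For instance, the error and isStreaming values, which the basic example above does not use, could be surfaced like this (ChatStatus is illustrative, and the error type is an assumption; adjust it to the actual value the hook returns):

import React from 'react';
import { Text } from 'react-native';

// Illustrative helper: renders streaming/error state returned by useChat
const ChatStatus = ({
  isStreaming,
  error,
}: {
  isStreaming: boolean;
  error?: unknown;
}) => (
  <>
    {isStreaming ? <Text>Assistant is typing...</Text> : null}
    {error ? <Text style={{ color: 'red' }}>{String(error)}</Text> : null}
  </>
);

You could then render ChatStatus below the message list, passing it the isStreaming and error values from the hook.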

Examples

License

Published under the MIT License; more details in the LICENSE file.
