react-native-gpt-streamhandler

React Native GPT Stream Handler

This is a helper library for React Native developers who want to build applications with the OpenAI API. Its main aim is to streamline handling of the Server-Sent Events (SSE) stream from the OpenAI API, mimicking a real-time chat environment.

The package exposes two main components, ChatHandler and SingleMessageHandler, which handle multi-turn conversations and single messages respectively.

Installation

Use the package manager npm to install:

npm install react-native-gpt-streamhandler

Usage

ChatHandler.js

ChatHandler.js is the main component to handle multi-turn conversations with the OpenAI API.

import { useState } from 'react';
import { ChatHandler, GPTMODELS } from 'react-native-gpt-streamhandler';

const [messages, setMessages] = useState([{ role: 'user', content: "What's the meaning of life?" }]);
const [isTyping, setIsTyping] = useState(false);

// Sample usage

ChatHandler({
  messages: messages,
  setMessages: setMessages,
  apiKey: 'sk_xxxxxxxxxx',
  statusCallback: setIsTyping,
  model: GPTMODELS.GPT3_5_TURBO,
});

Parameters

  • messages: An array of message objects with the structure: {role: 'user' | 'assistant', content: 'string'}.
  • setMessages: Function to update the state of the messages.
  • apiKey: Your OpenAI API Key (without the 'Bearer' prefix).
  • statusCallback: Optional. Function to update the status of the chat (true when the chat starts, false when it ends).
  • model: Optional. Model to use for the chat (default is GPT-3.5-turbo). Refer to the GPTMODELS in constants.js for all available options.
  • url: Optional. URL of the API endpoint (default is OpenAI chat endpoint).
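
Putting the parameters together, here is a minimal sketch of a send handler that appends the user's message and passes the conversation to ChatHandler. The proxy URL and the GPT-4 model choice are only illustrations of the optional url and model overrides, and the GPTMODELS import from the package root is assumed.

import { ChatHandler, GPTMODELS } from 'react-native-gpt-streamhandler';

// Appends the user's message to the conversation and lets ChatHandler
// stream the assistant's reply into the same messages state.
const sendMessage = (text, messages, setMessages, setIsTyping) => {
  const updated = [...messages, { role: 'user', content: text }];
  setMessages(updated);

  ChatHandler({
    messages: updated,
    setMessages: setMessages,
    apiKey: 'sk_xxxxxxxxxx',
    statusCallback: setIsTyping,   // true while the reply is streaming
    model: GPTMODELS.GPT4,         // optional, defaults to GPT-3.5-turbo
    url: 'https://my-proxy.example.com/v1/chat/completions', // optional endpoint override (hypothetical proxy)
  });
};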

Response

The assistant's response will be appended to the messages array like this:

[
	{
		role: 'user',
		content: "What's the meaning of life?"
	},
	{
		role: 'assistant',
		content: 'As an AI Model...'
	}
]
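
Because ChatHandler updates the messages state as tokens arrive, the array can be rendered directly. The component below is an illustrative sketch (it is not part of the library) that shows one way to display the conversation with a plain FlatList; the isTyping prop is assumed to come from the statusCallback shown above.

import React from 'react';
import { FlatList, Text, View } from 'react-native';

// Renders the conversation; re-renders automatically whenever
// setMessages is called during streaming.
const ChatView = ({ messages, isTyping }) => (
  <View style={{ flex: 1 }}>
    <FlatList
      data={messages}
      keyExtractor={(item, index) => String(index)}
      renderItem={({ item }) => (
        <Text style={{ textAlign: item.role === 'user' ? 'right' : 'left' }}>
          {item.content}
        </Text>
      )}
    />
    {isTyping && <Text>Assistant is typing…</Text>}
  </View>
);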

SingleMessageHandler.js

SingleMessageHandler.js is a component for handling a single message with the OpenAI API; the response comes back as a single string.

import { useState, useEffect } from 'react';
import { SingleMessageHandler, GPTMODELS } from 'react-native-gpt-streamhandler';

const [response, setResponse] = useState("");
const [isTyping, setIsTyping] = useState(false);

// Sample usage
SingleMessageHandler({
  inputMessage: "What's the meaning of life?",
  setResponse: setResponse,
  apiKey: 'sk_xxxxxxxxxx',
  statusCallback: setIsTyping,
  model: GPTMODELS.GPT3_5_TURBO,
});

useEffect(() => {
	// get the realtime response
	console.log(response);
}, [response]);

Parameters

  • inputMessage: The input message to send to the API.
  • setResponse: Function to update the state of the response in realtime.
  • apiKey: Your OpenAI API Key (without the 'Bearer' prefix).
  • statusCallback: Optional. Function to update the status of the chat (true when the chat starts, false when it ends).
  • model: Optional. Model to use for the chat (default is GPT-3.5-turbo). Refer to the GPTMODELS in Constants for all available options.
  • url: Optional. URL of the API endpoint (default is OpenAI chat endpoint).

Response

The setResponse function is called in real time, so the response state is updated for every token received.
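
For instance, binding the response state to a Text component renders the answer token by token as it streams in. The component below is an illustrative sketch; the ActivityIndicator wiring and the GPTMODELS import from the package root are assumptions, not part of the library.

import React, { useState } from 'react';
import { ActivityIndicator, Button, Text, View } from 'react-native';
import { SingleMessageHandler, GPTMODELS } from 'react-native-gpt-streamhandler';

const SingleAnswer = () => {
  const [response, setResponse] = useState('');
  const [isTyping, setIsTyping] = useState(false);

  const ask = () =>
    SingleMessageHandler({
      inputMessage: "What's the meaning of life?",
      setResponse: setResponse,   // called for every streamed token
      apiKey: 'sk_xxxxxxxxxx',
      statusCallback: setIsTyping,
      model: GPTMODELS.GPT3_5_TURBO,
    });

  // The Text below grows as new tokens arrive.
  return (
    <View>
      <Button title="Ask" onPress={ask} />
      {isTyping && <ActivityIndicator />}
      <Text>{response}</Text>
    </View>
  );
};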

Constants

export const GPTMODELS = {
  GPT3_5_TURBO: 'gpt-3.5-turbo',
  GPT3_5_TURBO_0613: 'gpt-3.5-turbo-0613',
  GPT3_5_TURBO_16K: 'gpt-3.5-turbo-16k',
  GPT3_5_TURBO_16K_0613: 'gpt-3.5-turbo-16k-0613',
  GPT4: 'gpt-4',
  GPT4_0613: 'gpt-4-0613',
  GPT4_32K: 'gpt-4-32k',
  GPT4_32K_0613: 'gpt-4-32k-0613'
};

GPTMODELS is an object that provides keys to reference the available GPT-3.5-turbo and GPT-4 model versions. URL is a constant string that points to the OpenAI chat API endpoint.

Contribution

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

License

MIT
