Documentation for OpenAI Assistant API

Below is the documentation for integrating Reconify with the OpenAI Assistant API via the NPM module or the REST API.

Get started - Create an account

The first step is to create an account at app.reconify.com.

Generate API and APP Keys

In the Reconify console, add an Application to your account. This will generate both an API_KEY and an APP_KEY, which will be used in the code below to send data to Reconify.

Node Integration

The easiest way to integrate in Node.js is with the NPM module.

Install the NPM module

npm install reconify --save

Import the module

import {reconifyOpenAIHandler} from 'reconify';

Initialize the module

Prior to initializing the Reconify module, make sure to import and initialize the OpenAI module.

import { OpenAI } from "openai";
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY
});

Initialize the module, passing in the openai object and the keys created above.

const reconify = reconifyOpenAIHandler(openai, {
  appKey: process.env.RECONIFY_APP_KEY,
  apiKey: process.env.RECONIFY_API_KEY,
})

Optional initialization parameters

You can optionally turn on debug mode by passing "debug: true" in the configuration object above. This will print debug messages to the console.

const reconify = reconifyOpenAIHandler(openai, {
  appKey: process.env.RECONIFY_APP_KEY,
  apiKey: process.env.RECONIFY_API_KEY,
  debug: true,
})

Log Interaction with Streaming

To log the interaction, send the Run object, input Message object, and output Message object to Reconify.

If using streaming, this can be done in the on('event') handler. See below.

//create thread
const thread = await openai.beta.threads.create();

//create message
const messageIn = await openai.beta.threads.messages.create(
  thread.id,
  {
    role: "user",
    content: "What are the store hours?"
  }
);

//streaming run
const run = openai.beta.threads.runs.stream(
  thread.id,
  {assistant_id: process.env.OPENAI_ASSISTANT_ID}
)
  .on('textDelta', (textDelta, snapshot) => {
    process.stdout.write(textDelta.value)
  })
  .on('event', async (params) => {
    const { event, data } = params;
    if (event === 'thread.run.completed') {

      //get last message from the thread - the output message
   
      //list the messages for this run (most recent first)
      const threadList = await openai.beta.threads.messages.list(
        data.thread_id,
        {
          limit: 1,
          run_id: data.id
        })

      //get the messages
      let messages = threadList?.data
      //get last message
      let messageOut = null
      if (Array.isArray(messages)) {
        messageOut = messages[0]
      }
      //log interaction to Reconify
      reconify.logAssistant(data, messageIn, messageOut)
    }
  });
    

Log Interaction without Streaming

To log the interaction, send the Run object, input Message object, and output Message object to Reconify.

Without streaming, this can be done after checking the run status. See below.

//create thread
const thread = await openai.beta.threads.create();

//create message
const messageIn = await openai.beta.threads.messages.create(
  thread.id,
  {
    role: "user",
    content: "What are the store hours?"
  }
);

//run without streaming
const run = await openai.beta.threads.runs.createAndPoll(
  thread.id,
  {assistant_id: process.env.OPENAI_ASSISTANT_ID}
)

//check status
if (run.status === 'completed') {
  //get last message from the thread - the output message

  //list the messages for this run (most recent first)
  const threadList = await openai.beta.threads.messages.list(
    run.thread_id,
    {
      limit: 1,
      run_id: run.id
    })

  //get the messages
  let messages = threadList?.data
  //get last message
  let messageOut = null
  if (Array.isArray(messages)) {
    messageOut = messages[0]
  }
  //log interaction to Reconify
  reconify.logAssistant(run, messageIn, messageOut)
}
    

Optional methods

You can optionally pass in a user object or session ID to be used in the analytics reporting. The session ID will be used to group interactions together in the same session transcript.

The user object should include a unique userId, the other fields are optional.

reconify.setUser({
  "userId": "123",
  "isAuthenticated": 1,
  "firstName": "Francis",
  "lastName": "Smith",
  "email": "",
  "phone": "",
  "gender": "female"
});

The session ID can be used to group interactions together in a thread. By default, the Assistant thread ID is used to group the interactions. You can set a session ID to override this behavior with your own value.

reconify.setSession('MySessionId');

REST API Integration

Below are instructions for integrating with the REST API.

At a high level, you will send a copy of the same request and response messages to Reconify.

Endpoint

Sign up for an account to get the endpoint URL.

JSON Payload Structure

{
"reconify": {},
"timestamps": {},
"session": "",
"sessionTimeOut": null,
"user": {},
"request": {},
"response": {}
}

Reconify Parameters (required)

{
"reconify": {
 "format": "openai",
 "type": "assistant",
 "version": "3.1.0",
 "appKey": process.env.RECONIFY_APP_KEY,
 "apiKey": process.env.RECONIFY_API_KEY
}
}

  • "format" is required and should be set to "openai"
  • "type" must be "assistant"
  • "version" is the version of the Reconify REST API
  • "appKey" and "apiKey" are the Reconify values generated when creating the app

Timestamp Parameters (optional)

Timestamps are in milliseconds. If timestamps are not passed in, the time the API receives the payload will be used.

{
"timestamps": {
 "request": 1725321819000,
 "response": 1725321821000
}
}
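
As an illustration, these timestamps can be captured in Node with Date.now(), which returns milliseconds. This is a minimal sketch; the variable names are examples only.

//capture millisecond timestamps around the Assistant call (illustrative sketch)
const requestTimestamp = Date.now();
//... create the message and run the Assistant ...
const responseTimestamp = Date.now();

const timestamps = {
  request: requestTimestamp,
  response: responseTimestamp
};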

Session Parameters (optional)

Both session parameters are optional. By default, the thread id of the Assistant will be used to group messages in a thread. The session id and timeout are to override this behavior.

  • "session" is an alphanumeric string used to group interactions together in the same session transcript
  • "sessionTimeout" is the length of time in minutes to wait between interactions before ending the current session. The default is 10.

{
"session": "ABCDEF123456",
"sessionTimeout": 10
}

User Parameters (optional)

The user object is optional. If a user object is sent, either "userId" or "email" is required to track unique users; otherwise all users will be counted as new.

  • "userId" is an alphanumeric string unique for the user
  • "isAuthenticated" is optional and can be 1 or 0 to track logged in users
  • "firstName" and "lastName" are optional alphanumeric strings
  • "email" is an optional alphanumeric string, it is used to identify a user if a userId is not present
  • "phone" is optional
  • "gender" is optional

{
"user: {
 "userId": "ABC123",
 "isAuthenticated": 1,
 "firstName": "Jane",
 "lastName": "Doe",
 "email": "some_email",
 "phone": "555-555-5555",
 "gender": "female"
}
}

Request Parameters (required)

"request": {
       "id": "msg_n7aClsT6jpI21Z5AZ2GjPTuX",
       "object": "thread.message",
       "created_at": 1725321818,
       "assistant_id": process.env.OPENAI_ASSISTANT_ID,
       "thread_id": "thread_KPMPPkxgHAIbYkkH8nREZyWE",
       "run_id": null,
       "role": "user",
       "content": [
           {
               "type": "text",
               "text": {
                   "value": "What are the store hours?",
                   "annotations": []
               }
           }
       ],
       "attachments": [],
       "metadata": {},
       "model": "gpt-4o-mini"
   },

Response Parameters (required)

"response": {
       "id": "msg_XLH7LqknArodE3mp9e2tiGYx",
       "object": "thread.message",
       "created_at": 1725321820,
       "assistant_id": process.env.OPENAI_ASSISTANT_ID,
       "thread_id": "thread_KPMPPkxgHAIbYkkH8nREZyWE",
       "run_id": "run_jfYXmD7ZwIItSE1RPibetfXS",
       "role": "assistant",
       "content": [
           {
               "type": "text",
               "text": {
                   "value": "Our store is open Monday through Friday from 9:00am to 5:00pm.",
                   "annotations": []
               }
           }
       ],
       "attachments": [],
       "metadata": {},
       "usage": {
           "prompt_tokens": 557,
           "completion_tokens": 22,
           "total_tokens": 579
       }
   }
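
Example: Sending the payload

The following is a minimal Node sketch that assembles the payload described above and POSTs it to the Reconify endpoint. The RECONIFY_ENDPOINT environment variable, the HTTP method and headers, and the messageIn/messageOut variables are assumptions: substitute the endpoint URL provided with your account and the Message objects returned by the Assistants API.

//minimal sketch: assemble and send the payload (assumes Node 18+ global fetch)
//RECONIFY_ENDPOINT is a placeholder - use the endpoint URL provided with your account
const payload = {
  reconify: {
    format: "openai",
    type: "assistant",
    version: "3.1.0",
    appKey: process.env.RECONIFY_APP_KEY,
    apiKey: process.env.RECONIFY_API_KEY
  },
  timestamps: {
    request: requestTimestamp,
    response: responseTimestamp
  },
  session: "ABCDEF123456",
  sessionTimeout: 10,
  user: {
    userId: "ABC123"
  },
  request: messageIn,   //the input Message object
  response: messageOut  //the output Message object from the run
};

const result = await fetch(process.env.RECONIFY_ENDPOINT, {
  method: "POST",
  headers: {"Content-Type": "application/json"},
  body: JSON.stringify(payload)
});

if (!result.ok) {
  console.error("Reconify logging failed:", result.status);
}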