Below is the documentation for integrating Reconify with the OpenAI Assistant API via the NPM module or the REST API.
The first step is to create an account at app.reconify.com.
In the Reconify console, add an Application to your account. This will generate both an API_KEY and an APP_KEY which will be used in the code below to send data to Reconify.
The easiest way to integrate in Node.js is with the NPM module.
npm install reconify --save
import {reconifyOpenAIHandler} from 'reconify';
Prior to initializing the Reconify module, make sure to import and initialize the OpenAI module.
import { OpenAI } from "openai";
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY
});
Initialize the module passing in the openai object and the keys created above.
const reconify = reconifyOpenAIHandler(openai, {
appKey: process.env.RECONIFY_APP_KEY,
apiKey: process.env.RECONIFY_API_KEY,
})
You can optionally turn on "debug" mode by passing in "debug: true" in the JSON above. This will print debug messages to the console.
const reconify = reconifyOpenAIHandler(openai, {
appKey: process.env.RECONIFY_APP_KEY,
apiKey: process.env.RECONIFY_API_KEY,
debug: true,
})
To log the interaction, send the Run object, input Message object, and output Message object to Reconify.
If using streaming, this can be done in the on('event') handler. See below.
//create thread
const thread = await openai.beta.threads.create();
//create message
const messageIn = await openai.beta.threads.messages.create(
thread.id,
{
role: "user",
content: "What are the store hours?"
}
);
//streaming run
const run = openai.beta.threads.runs.stream(
thread.id,
{assistant_id: process.env.OPENAI_ASSISTANT_ID}
)
.on('textDelta', (textDelta, snapshot) => {
process.stdout.write(textDelta.value)
})
.on('event', async (params) => {
const {event, data} = params
if(event == 'thread.run.completed') {
//get last message from the thread - the output message
//get the thread
const threadList = await openai.beta.threads.messages.list(
data.thread_id,
{
limit: 1,
run_id: data.id
})
//get the messages
let messages = threadList?.data
//get last message
let messageOut = null
if (Array.isArray(messages)) {
messageOut = messages[0]
}
//log interaction to Reconify
reconify.logAssistant(data, messageIn, messageOut)
}
});
If not using streaming, log the interaction after the run completes by sending the Run object, input Message object, and output Message object to Reconify. See below.
//create thread
const thread = await openai.beta.threads.create();
//create message
const messageIn = await openai.beta.threads.messages.create(
thread.id,
{
role: "user",
content: "What are the store hours?"
}
);
//run without streaming
const run = await openai.beta.threads.runs.createAndPoll(
thread.id,
{assistant_id: process.env.OPENAI_ASSISTANT_ID}
)
//check status
if(run.status == 'completed') {
//get last message from the thread - the output message
//get the thread
const threadList = await openai.beta.threads.messages.list(
run.thread_id,
{
limit: 1,
run_id: run.id
})
//get the messages
let messages = threadList?.data
//get last message
let messageOut = null
if (Array.isArray(messages)) {
messageOut = messages[0]
}
//log interaction to Reconify
reconify.logAssistant(run, messageIn, messageOut)
}
You can optionally pass in a user object or session ID to be used in the analytics reporting. The session ID will be used to group interactions together in the same session transcript.
The user object should include a unique userId, the other fields are optional.
reconify.setUser({
"userId": "123",
"isAuthenticated": 1,
"firstName": "Francis",
"lastName": "Smith",
"email": "",
"phone": "",
"gender": "female"
});
The session ID can be used to group interactions together in a thread. By default, the Assistant thread id will be used to group the interactions. You can set a session ID to override this behavior with your own session id.
reconify.setSession('MySessionId');
Below are instructions for integrating with the REST API
At a high level, you will send a copy of the same request and response messages to Reconify.
Sign up for an account to get the endpoint URL.
{
"reconify": {},
"timestamps": {},
"session": "",
"sessionTimeout": null,
"user": {},
"request": {},
"response": {}
}
{
"reconify": {
"format": "openai",
"type": "assistant",
"version": "3.1.0",
"appKey": "YOUR_RECONIFY_APP_KEY",
"apiKey": "YOUR_RECONIFY_API_KEY"
}
}
Timestamps are in milliseconds. If timestamps are not passed in, the time the API receives the payload will be used.
{
"timestamps": {
"request": 1725321819000,
"response": 1725321821000
}
}
Both session parameters are optional. By default, the Assistant's thread id will be used to group messages in a thread. The session id and timeout can be used to override this behavior.
{
"session": "ABCDEF123456",
"sessionTimeout": 10
}
The user object is optional. If a user object is sent, either "userId" or "email" is required so that unique users can be tracked; otherwise every interaction will be counted as a new user.
{
"user": {
"userId": "ABC123",
"isAuthenticated": 1,
"firstName": "Jane",
"lastName": "Doe",
"email": "some_email",
"phone": "555-555-5555",
"gender": "female"
}
}
The "request" and "response" objects are the input and output thread.message objects returned by the OpenAI API, for example:
"request": {
"id": "msg_n7aClsT6jpI21Z5AZ2GjPTuX",
"object": "thread.message",
"created_at": 1725321818,
"assistant_id": "YOUR_ASSISTANT_ID",
"thread_id": "thread_KPMPPkxgHAIbYkkH8nREZyWE",
"run_id": null,
"role": "user",
"content": [
{
"type": "text",
"text": {
"value": "What are the store hours?",
"annotations": []
}
}
],
"attachments": [],
"metadata": {},
"model": "gpt-4o-mini"
},
"response": {
"id": "msg_XLH7LqknArodE3mp9e2tiGYx",
"object": "thread.message",
"created_at": 1725321820,
"assistant_id": "YOUR_ASSISTANT_ID",
"thread_id": "thread_KPMPPkxgHAIbYkkH8nREZyWE",
"run_id": "run_jfYXmD7ZwIItSE1RPibetfXS",
"role": "assistant",
"content": [
{
"type": "text",
"text": {
"value": "Our store is open Monday through Friday from 9:00am to 5:00pm.",
"annotations": []
}
}
],
"attachments": [],
"metadata": {},
"usage": {
"prompt_tokens": 557,
"completion_tokens": 22,
"total_tokens": 579
}
}
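Putting the pieces above together, a minimal sketch of assembling and sending the payload in Node.js might look like the following. This is an illustration, not an official client: RECONIFY_INGEST_URL is a placeholder for the endpoint URL from your Reconify console, and buildPayload is a hypothetical helper whose field names mirror the payload structure documented above.

```javascript
// Placeholder: use the endpoint URL provided in your Reconify console.
const RECONIFY_INGEST_URL = process.env.RECONIFY_INGEST_URL;

// Hypothetical helper that assembles the REST payload described above.
// "request" and "response" are the input and output thread.message objects.
function buildPayload({appKey, apiKey, request, response, user, session}) {
  return {
    reconify: {
      format: "openai",
      type: "assistant",
      version: "3.1.0",
      appKey,
      apiKey,
    },
    timestamps: {
      // Millisecond timestamps; substitute your actual request/response times.
      request: Date.now() - 2000,
      response: Date.now(),
    },
    // session and user are optional; omit them if not tracked.
    ...(session ? {session} : {}),
    ...(user ? {user} : {}),
    request,
    response,
  };
}

// Sending the payload with fetch (built in on Node 18+), assuming the
// endpoint accepts a JSON POST:
//
// await fetch(RECONIFY_INGEST_URL, {
//   method: "POST",
//   headers: {"Content-Type": "application/json"},
//   body: JSON.stringify(buildPayload({
//     appKey: process.env.RECONIFY_APP_KEY,
//     apiKey: process.env.RECONIFY_API_KEY,
//     request: messageIn,
//     response: messageOut,
//   })),
// });
```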