Please note that 'Variables' are now called 'Fields' in Landbot's platform.
Artificial intelligence continues to evolve, and embracing the opportunities it brings can make our lives easier. As we’ll see in this article, an OpenAI chatbot integration can help you automate part of your bot-building process and offer human-like interactions to your prospects and customers, thanks to GPT-4o by OpenAI, a natural language processing (NLP) model that uses deep learning to generate text that closely resembles human language.
Why is it so important?
It's a game-changer for chatbot creation: combined with the right visual builder, such as Landbot, it makes AI bot building accessible to anyone.
In this step-by-step tutorial, we demonstrate how to set up the OpenAI chatbot integration so you can make the most of AI without writing a single line of code, whether coding isn’t your forte or you simply want to speed up chatbot development.
OpenAI Chatbot Integration: How to Speed Up the Bot Building Process
In this tutorial, we will walk you through the process of creating a WhatsApp survey and managing the OpenAI chatbot integration. The bot in question is a customer satisfaction bot that we want to use to collect data and summarize the incident in case a customer is unsatisfied with the service/product.
1. Setting Up the Accounts
First things first: to create the chatbot, you will need to sign up for or log in to your Landbot and OpenAI accounts. As soon as you create your OpenAI account, you will be redirected to the API key area, which will be important later in the process. If not, simply click on your account icon in the top right corner and select “View API keys.”
2. Outline of Chatbot's Purpose and Flow
We want this AI bot to collect a customer order number, email address, and a summary of the issue. Then, have it connect and send this information to our customer support team database—in this case, stored in Airtable—so they can verify the email address and order number.
We are going to divide the chatbot flow into two main parts.
The first part will be a loop where, thanks to the OpenAI chatbot integration, GPT will ask for the necessary information from the user. The dialog will continue until the bot has all the data it was instructed to gather.
In the second part of the flow, the bot is going to create a summary of this conversation and send it to our Airtable database.
3. Set Up an Opening Message
First of all, we need to set up the first outreach message that the customer will receive via WhatsApp. The idea here is to use a straightforward, structured button question to assess the experience:
Since this is a WhatsApp bot, you will have the user's name automatically, without asking, because the customer has already opted in, and you can use it to personalize the conversation from the start.
Now, since the OpenAI chatbot integration is designed to collect incident reports, only the users who click on the sad face are going to talk to the AI bot. Those who click on the very or mildly satisfied option will receive a simple “Thank you” response and the chat will be closed.
Since the majority of the responses lead down this path, the purple (default) button connects with the “Thank you” message and closes the chat. The default button comes in particularly handy in cases where you have many different responses that are important but don’t affect the direction of the flow. There is no need to drag an arrow from each response individually.
Now, we will drag a separate arrow from the “sad face” button to create an AI flow for dealing with unsatisfied customers.
4. Set up a Conversation “Memory Box”
Before anything else, we are going to create a storage container for the chat. Think of it as an empty box you will fill with the conversation as it is created.
Why is this step so important?
When we send this box to the OpenAI chatbot integration, it will act as its memory so the bot doesn’t ask for the same information twice and will help us maintain a logical and helpful conversational flow. Without it, the chatbot would just continue asking the same questions over and over again.
To create the “box,” we will use the “Set a Field” block:
It’s important to note that the field storing all the conversation data is in an array format. The “Type the value” field includes empty square brackets, which represent the empty container where the conversation will be stored.
Note: You can always create your own fields to store data collected by the chatbot. The best practice is to give them descriptive names (e.g., “@conversation_history” in our case) so you know exactly what information they are storing.
5. Connect the Flow: OpenAI Chatbot Integration
First, we will use the “Set a Field” block again to store our OpenAI API key.
Now it’s time to go to your OpenAI account and copy your API key, as we mentioned at the beginning of the article.
Now, we will explain the OpenAI chatbot integration using the Webhook block to connect both platforms.
Click on the Webhook block to open the editor.
You will find the information needed to populate this section in the OpenAI API docs under Completions > Create Completion.
All you need to do is copy and paste the URL and set the method to POST.
Next, move your attention to the “Customize Header” segment of the Webhook editor.
This is the space where you need to enter the API authorization.
You can find this information in the “Example request” in the “Create Completion” segment.
The custom header key is “Authorization” and the value is the word “Bearer” followed by a single space and your API key.
Since you created the API key field, the chatbot automatically pulls your API key into the Webhook using that field.
The next section in the block that requires attention is the “Customize Body” part.
Here, you need to refer to the “Example request” section in the OpenAI Completions documentation and copy the following:
This sequence specifies the following information:
- GPT model
- Prompt: the set of instructions for OpenAI that clarifies the purpose and scope of the request. It should be as clear and straightforward as possible, leaving no room for interpretation. Feel free to repeat commands several times to achieve this. A clear prompt helps prevent hallucinations.
- Maximum Tokens: Tokens are the basic units that OpenAI models, including ChatGPT, use to compute the length of a text. They are characters or groups of characters that may or may not correspond to words. For simplicity, you can think of them as roughly the number of characters.
- Temperature: The GPT model temperature setting is an indication to the AI of how much it is allowed to improvise and “get creative.” The lowest possible temperature is 0 and the highest is 1. The increments are measured in decimals, for example, 0.1, 0.2, etc. We set our bot’s temperature to 0, meaning it can only work with what’s there. You might want to increase the temperature if you are working with more creative use cases.
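To make the four parameters above concrete, here is a sketch in Python of the request the Webhook block effectively sends to OpenAI's Completions endpoint. The model name, token limit, and prompt text are illustrative assumptions; copy the exact values for your request from the “Example request” in OpenAI's own docs:

```python
# Sketch of the request the Webhook block sends to OpenAI's
# Completions endpoint. Values below are illustrative, not canonical.
COMPLETIONS_URL = "https://api.openai.com/v1/completions"

def build_completion_request(api_key: str, prompt: str):
    """Return the URL, headers, and JSON body for the webhook call."""
    headers = {
        # "Bearer", a single space, then your secret API key
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": "gpt-3.5-turbo-instruct",  # assumption: any completions-capable model
        "prompt": prompt,                   # the instructions described above
        "max_tokens": 300,                  # caps the length of the reply
        "temperature": 0,                   # 0 = no improvisation, stick to given facts
    }
    return COMPLETIONS_URL, headers, body

url, headers, body = build_completion_request(
    "sk-...", "Client: my order arrived broken. You: \n"
)
```

In Landbot you never write this code yourself; the Webhook block assembles the same URL, header, and body from the values you paste into its editor.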
An example of a prompt could be as follows:
A client (@name) is contacting us because something went wrong. You must act as a friendly agent in charge of collecting a clear idea of what went wrong with the order, you need to ask them. We know there was an issue but we need to know what it was, so you need to find out. Also, get their email address and order number (don't show the summary to the user and do not create any info). Ask only one question at a time and be friendly. Your job is not to give support, only to collect the information. Don’t create any information, it must be given by the client. Here's your conversation history with the client: @conversation_history, once you've gathered all three pieces of information from the client and they no longer need help say ‘An agent will look into this’, be sure to use the keywords ‘An agent will look into this’ only when you have a clear summary of the issue (at least one sentence from the user), an order number, and an email address and the client no longer needs help. Client: @user_text. You: \n"
This prompt forbids the bot from creating information, instructing it to collect and summarize the key points instead. It also introduces the field that stores the whole conversation, so the bot can access the updated memory on every loop iteration, and tells the bot that user responses are stored as @user_text. And, since it’s a completion model designed to complete the dialog, the prompt ends with “You: \n” so the bot knows how to complete the responses to user inputs.
The next section in the Webhook editor is “Test Your Request.” To make sure everything is working correctly, you just need to click on the “Test the request” button.
If the connection is working OK, you will receive a 200 response.
We are about to finish our bot! The remaining sections we should take care of are “Save Responses as Fields” and “Response Routing.”
The “Save Responses as Fields” section allows us to save the OpenAI responses under a field and show them to the user.
The “Response Routing” does a bit of damage control. If the OpenAI-chatbot integration connection is working well (200) the flow continues into the loop we designed. If there is an error (429) and the OpenAI servers are overloaded, it allows you to reroute to the “Try again later” message.
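The routing logic the block applies can be summed up in a few lines. This is only an illustration of the branching (the function and output names are made up for this sketch; in Landbot you simply connect the 200 and 429 outputs with arrows):

```python
def route_response(status_code: int) -> str:
    """Mirror of the Webhook block's 'Response Routing' outputs (illustrative)."""
    if status_code == 200:
        return "continue_loop"    # success: carry on with the AI conversation
    if status_code == 429:
        return "try_again_later"  # OpenAI servers overloaded or rate limited
    return "error"                # anything else: fail safe
```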
This is it for the Webhook block. Now let’s take a deeper look at the Formulas block.
6. Set Up the Formula Block to Push Conversation into the “Memory Box”
While the error 429 Webhook response reroutes the users to the try-again-later message, the 200 response leads them into the conversational loop with the OpenAI chatbot.
As mentioned, for the conversation to actually work and lead somewhere, the bot needs to remember the data collected in every step. To do that we need to make sure to store the conversation inside the “empty box” we created at the beginning of the flow using the Set a Field block.
To fill it in, we are using the Formulas block. In short, this block allows you to perform actions that normally require some coding, such as more complex calculations or advanced formatting conditions and changes. Instead of coding, you use functions similar to formulas in Google Spreadsheets.
If you wish to learn more about how the Formulas block works and what you can do with it, check out our video tutorial below from our Landbot Academy.
You only need to know we are using the “Push” formula to push the @response (OpenAI) and, later, @user_text (user input) inside our empty conversation memory box @conversation_history.
This is what the formula looks like:
Push(Push(@conversation_history, '@response'), '@user_text')
7. Create the Conversation Loop
Once we have a formula that ensures the conversation history box is filled in, we can proceed to the actual conversation.
The Formulas block takes us to the “Ask a question” block. Instead of a fixed text, the bot's “question text” field is populated with the @response field, which stores and showcases the OpenAI response at that particular point in the conversation loop. The user is given the space to answer in their own words, and their answer is stored in the @user_text field.
In theory, you could loop back to the Webhook block from here already. However, that would create a never-ending conversation. That’s why, before connecting back to the Webhook and closing the loop, we need to give the bot a way out if it has all the information it needs.
8. Set Up a Condition to End the Conversation When the Goal is Reached
Using the Conditional Logic block is a very simple way to conclude the conversation and gather all the data.
Since, in our prompt, we instructed OpenAI to say, “An agent will look into this” when it had all the necessary information, you can use this response as a condition to let the bot leave the loop:
This way, if the AI bot says an agent will look into it, the flow takes that as a signal that all the necessary information has been collected and follows the green output towards the EXIT flow. If @response does not contain the word “agent,” the flow takes the pink output and sends the conversation back to the Webhook block to repeat the loop.
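The Conditional Logic block's check amounts to a simple substring test. A minimal sketch (the function name is hypothetical; in Landbot you configure this visually as "@response contains 'agent'"):

```python
def reached_goal(response: str) -> bool:
    """Exit the loop when the AI uses the agreed keyword from the prompt."""
    return "agent" in response.lower()

reached_goal("An agent will look into this.")       # True: green output, exit flow
reached_goal("Could you share your order number?")  # False: pink output, loop again
```

This is also why the prompt insists on the exact phrase “An agent will look into this”: a reliable keyword makes the exit condition easy to detect.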
9. Define the Exit Flow
The easiest thing to do here would be to copy the Webhook block you set up previously, go to the “Customize Body” and simply update the Prompt to deliver new instructions.
If you don’t want to deal with the Webhook block, you can also use the same trick as before: employing the “Set a Field” block.
So, if you want to modify the final prompt, just open the “Set a Field” editor and apply changes to the “type the value” field.
Inside the Webhook, the prompt should look as follows:
Since it includes the @final_prompt field instead of a fixed text, any changes you make to the prompt inside the preceding “Set a Field” block will be reflected automatically.
The final response from OpenAI will include a summary of the incident with all the information from the customer you need to resolve it.
10. Send Information to your Database
The last thing left to do is to send the collected data to our database. To do that, you can use the Landbot native integration.
To learn how to integrate Airtable, see the short tutorial below:
However, feel free to substitute the Airtable block with the Google Spreadsheets integration block if it suits your needs better. Below, you will find more information about the process you need to follow to connect your chatbot with Google Spreadsheets:
After sending the information to the selected database, we will add a block to say goodbye and close the chat.
11. How to Avoid Common Errors
Bot building is an adventure, and details can escape your attention. So, if things are not working, make sure:
- Field names are spelled correctly
- None of the fields were accidentally deleted
- You are using the correct OpenAI API Key
- You are using the right GPT model
- Your Tokens are not expired
And that’s it!
You have managed to complete the OpenAI chatbot integration and now you have a working WhatsApp bot to deal with customer feedback efficiently!
If you need further information on how to deal with the integration of Landbot with the OpenAI Assistant, you will find more details in our Knowledge Center, as well as everything you need related to the latest models, such as GPT-4o.
To Wrap Things Up
We hope you have found this article helpful on your journey to creating bots that will improve customer experiences and help grow your business.
Remember that if you are interested in more resources related to integrating OpenAI with your chatbots, be sure to check out our Knowledge Center and also keep an eye on our blog for more tips and tutorials!