The releases of Microsoft-backed ChatGPT, Anthropic’s Claude, and Google’s Bard have made the world realize, myself included, that Large Language Models (LLMs) are bringing about a revolution comparable to that of the Internet itself. Regardless of whether those models qualify as intelligent, these newfound capabilities represent a paradigm shift in how humans can use technology to complement our cognitive abilities. With it, the concept of AI, and of how to build AI chatbots, has changed forever.
The defining feature of the AI chatbot era will be the automation of many tasks that, until now, could only be performed by humans. Our cognitive work will move to a higher level of abstraction: instead of deliberating over the wording of our communications with customers, we will think about the messages and emotions we want to convey, delegating the actual copywriting to AI.
Before ChatGPT, no conversation system in existence could compete with the performance of a human mind in terms of natural language understanding and generation. A chatbot could only handle structured, scripted conversations, relying on human intervention whenever the end user went off script. ChatGPT has closed the chapter on that era. Despite its limitations (weak groundedness, gaps in factual knowledge, inevitable hallucinations, etc.), it’s difficult not to be impressed by its capabilities. And why is that? It’s because ChatGPT has anthropomorphic interactions with people. In other words, human-like conversations with AI chatbots are a tangible reality, one that we are experimenting with at Landbot right now to create frictionless conversational experiences across the customer journey.
Where does that leave us in the midst of this new AI chatbot ecosystem? Without further ado, I’d like to introduce you all to Landbot AI—but let’s cover some bases first.
LLMs: Why Do They Matter?
Large Language Models are AI systems trained on massive amounts of textual data: books, web pages, manually curated examples, and software code. Their training paradigm is simple: they adjust their parameters so that they accurately predict the next token in a sequence (a word, a code statement, etc.). The discovery of this decade is that, with large enough models and datasets, LLMs can accomplish more complex tasks: answering questions, summarizing texts, translating between languages, and few-shot learning, where the model learns to perform a new task from only a handful of examples. Given a prompt, the AI completes the task at hand. This means LLMs can generate text similar to how humans speak and write.
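The next-token objective described above can be illustrated with a toy sketch. The snippet below trains a tiny bigram model, a drastically simplified stand-in for a real LLM, that predicts the most likely next word purely from counts; the corpus and function names are hypothetical, chosen only for illustration.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the terabytes of text a real LLM is trained on.
corpus = "the cat sat on the mat the cat chased the mouse".split()

# Count bigrams: how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat", since "cat" follows "the" most often here
```

A real LLM replaces these raw counts with billions of learned parameters and predicts over subword tokens rather than whole words, but the objective is the same: given the sequence so far, guess what comes next.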
So what’s the major hype about ChatGPT? ChatGPT has demonstrated that a single LLM, with minor customization, can eliminate the need to train Natural Language Understanding (NLU) and Natural Language Generation (NLG) models. All you need to do is give the LLM a prompt explaining what you want it to do and the AI will do it for you. Apart from the fact that ChatGPT has been trained with massive data sources, its ability to interpret a request, tailor content according to that request, and “remember” what was said earlier in the conversation make it ripe with potential.
Limitations of LLMs Today
As of today, LLMs do not come without caveats. Social media is exploding with new applications built on top of GPT-like models (LLMs, in other words). With all the noise, it’s easy to forget that most of these products and applications are just proofs of concept, not fully functional products that can be sold or offered at scale. So, what will it take for some of these ideas to be market-ready? Here are the main limitations, mentioned earlier and worth highlighting:
- Lack of groundedness: Trained on text alone, an LLM has no way of understanding what the symbol “cat” in a sentence actually refers to. It only knows that “cat” is related to other words such as “meowing” or “hissing”. That said, this limitation is improving with the use of additional datasets such as description-code pairs and instruction datasets.
- Hallucinations: Yes, you read that right! At times, LLMs can generate complete nonsense. The output has the right “form,” but the content makes no sense. This tends to happen when the model doesn’t have access to the information it needs to answer the query, so it makes it up!
- High latency times: LLMs take their time to generate responses. This means LLMs are not suitable for applications or products that need fast response times (less than a few seconds).
- Expensive to train and deploy: LLMs must run on specialized, expensive hardware, even at today’s relatively slow output speeds, and they are trained on terabytes of data. The estimated cost of training a billion-parameter LLM is around one million dollars.
Today, ChatGPT, like LLMs in general, is not yet ready to become a fully productized conversation system. While there are other “cons” to take into account, many will turn into “pros” as the models get smarter and more accessible over the next year. So, what does this have to do with how to build AI chatbots, and where does Landbot AI come into play? I’ll tell you now.
How to Build AI Chatbots that Solve Real Challenges
We are experimenting in the AI chatbot ecosystem to help businesses overcome the challenges they’ve faced in the past when it comes to conversational automation. With more people developing solutions on top of GPT-3 and other LLMs, the need for those solutions to meet existing software development requirements still stands. Even for no-coders, designing a UX that aligns with your brand, building the bot in collaboration with peers, and deploying it at scale are all essential to the AI chatbot building process.
There are other fundamental needs when it comes to chatbot building that are unlikely to change even as more AI-powered chatbot solutions become available. From our perspective, these are:
- Designing task-specific conversational experiences: No matter where your customer is on the journey, businesses will still need to think specifically about the experience they are creating for end users. AI-powered chatbots do not change the need to design frictionless experiences that alleviate pain points for customers to successfully acquire, nurture, and retain them.
- Optimizing chatbot flows based on user behavior: AI chatbots become more intelligent over time. This is part of their draw in the market we are seeing right now. Nevertheless, companies still need to analyze the performance of the bot and to optimize parts of the flow where conversion rates may drop off based on how users interact with the chatbot. AI or no AI.
- Integrating seamlessly with third parties: Building AI chatbot solutions does not remove the need for easy integration with third-party platforms. Regardless of the data captured by the bot, it’s what happens with that information that matters. The chatbot still needs to properly create, retrieve, and update the data obtained from conversations in the tech stacks and CRMs your teams use. Seamless integration still matters.
- Providing chatbot assistance on different channels: Chatbots can and should be deployed across the different channels your customers use: WhatsApp, website, Messenger, etc. Using AI does not negate the fundamental need to meet your customers where they are—and engage them through friendly conversations.
Landbot AI: GPT-3 Integration Made Easy
I’m pleased to announce that integrating GPT-3 into your chatbot flow is now available to ALL Landbot users! A core value of Landbot is to connect easily with other applications, and that applies to GPT-3 too. While this integration comes with the existing limitations of GPT-3 and other LLMs today, it will help you build AI-powered chatbots more easily. We want to reiterate that the integration you are about to see is not a native integration. In the example below, we’ve used this GPT-3 integration to help Customer Service teams answer common customer queries more efficiently with a FAQ bot.
Let me take you through a brief explanation and show you how we used this GPT-3 integration to create a FAQ bot. You’ll also be provided with a how-to tutorial, a FAQ bot template, and have the opportunity to play with our FAQ demo bot to get inspired in your AI chatbot building process.
- GPT-3 Integration to Create a FAQ Bot (Available): Do you find yourself providing customers the same answers to the same questions over and over again? Maybe you also have a FAQ website page or Knowledge Base that nobody seems to read! Let our GPT-3 integrated FAQ bot give you a break! Whenever a customer asks a question you already have a documented answer to, this bot will “read” the content you’ve fed it. Instead of sharing links to related content, it will compile a summarized explanation—just like a human agent would do for the end user. Ready to have time to focus on more meaningful conversations? We’re betting you are.
*The GPT-3 Integration and FAQ bot template, tutorials, and demo bot are available to ALL Landbot users. This means you can use these materials to build your AI chatbots.
*BONUS: Below you’ll find useful information so you can build an FAQ bot using our GPT-3 integration, just like we did.
- Read our How to build a FAQ chatbot with GPT-3 tutorial for step-by-step guidance
- Access our FAQ bot template here (tutorial included) and follow the instructions for a fast build.
- Access our FAQ demo bot here to get inspired and see what you can do (reliability issues courtesy of OpenAI servers may apply).
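Under the hood, a GPT-3 FAQ bot of the kind described above typically retrieves the most relevant documented answer and then asks the model to summarize it for the user. The sketch below illustrates only the retrieval step, using plain keyword overlap as a stand-in for GPT-3; the FAQ entries and function names are hypothetical, not Landbot’s actual implementation.

```python
# Hypothetical FAQ entries the bot has been "fed".
faq = {
    "How do I reset my password?": "Go to Settings > Security and click Reset password.",
    "What channels does the bot support?": "The bot can be deployed on WhatsApp, web, and Messenger.",
}

def best_answer(question: str) -> str:
    """Pick the documented answer whose question shares the most words with the query."""
    q_words = set(question.lower().split())
    scores = {k: len(q_words & set(k.lower().split())) for k in faq}
    return faq[max(scores, key=scores.get)]

# In production you would then prompt GPT-3 with the retrieved entry, e.g.:
# prompt = f"Answer using only this FAQ entry:\n{entry}\nQuestion: {question}"
print(best_answer("how can I reset my password"))
```

A real deployment would use embeddings rather than word overlap to find the relevant entry, and the final summarization is what lets the bot answer like a human agent instead of pasting links.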
Landbot AI: Open & Closed Beta Versions
Our Product team has also been experimenting with two beta versions on our platform now. What does this mean? It means that these AI features or components are being productized inside Landbot, but access is limited or the feature is not entirely reliable due to the limitations of the ChatGPT LLM (explained above). We have different options for you to test out yourself (or get on the waitlist) as we continue developing them:
- Text-to-Bot (Available): When it comes to building bots, many people don’t know where to start! Don’t worry, we’ve got you covered. All you need to do is write a prompt describing what you want your bot to do. Would you like your bot to generate leads and ask for contact information, or make appointments for a service you provide? Consider it done. You can even tell us the tone of voice and language you’d like your bot to use so it adapts to your brand’s identity. With that information, your initial chatbot template is generated automatically. Let Landbot AI turn your text into a bot, so you don’t have to. The only limit is your imagination.
*This feature is in open beta version. This means it is available to ALL Landbot users.
- Natural Language Understanding (Waitlist): Everyone expresses themselves differently. Moreover, until now, data extracted from conversations has had to match a specific input format: numeric, text, language, etc. We are currently developing a Natural Language Understanding (NLU) component that can understand users' intentions, no matter how or in what language they express themselves. With human-like comprehension, our NLU component can direct your users to the flow that matches their intention.
*The NLU component is in closed beta version. This means it is NOT available to all Landbot users. We are currently accepting waitlist applications from users to participate in the NLU building process via interviews, user tests and private access.
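The routing idea behind an NLU component like the one described above can be sketched in a few lines. This toy version uses keyword matching where a real component would use an LLM, and every intent name and trigger word here is a made-up example, not part of Landbot’s product.

```python
# Hypothetical intents and trigger keywords; a real NLU component would classify
# with an LLM instead of keyword matching, but the routing logic is similar.
INTENTS = {
    "book_appointment": {"appointment", "book", "schedule", "booking"},
    "pricing": {"price", "cost", "plan", "pricing"},
    "support": {"help", "broken", "error", "issue"},
}

def route(message: str) -> str:
    """Route a user message to the flow whose intent keywords match best."""
    words = set(message.lower().split())
    scores = {intent: len(words & kws) for intent, kws in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "fallback"

print(route("I want to book an appointment"))  # routes to the booking flow
```

The point of the sketch is the shape of the problem: free-form text in, a structured flow name out, with a fallback when no intention can be recognized.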
Building AI Chatbots: To AInfinity and BOTyond
Whether it’s to generate leads, launch promotional campaigns, automate processes, or provide quality customer service, our mission has been to help businesses build frictionless conversational experiences from end-to-end. The vision for our no-code chatbot builder is based on turning conversations into profitable outcomes, tripling efficiency through automation, and cutting operating costs. Regardless of the channel—WhatsApp, website, or Messenger—our goal has always been to enable anyone to create automated chatbot flows that better engage customers no matter where they are on the journey. We believe the core of any good business is based on relationships. And relationships are built on top of conversations with customers.
ChatGPT and other LLMs have only set the bar higher in terms of consumer expectations for the conversations they have with brands. And this new technology has added fuel to the fire for us Landbotters. With it, we can help our clients build AI chatbots more efficiently by reducing development and deployment times—without sacrificing the experience.
The future of conversational automation is here. Now the time has come to invite those of you in our community to join us in the AI chatbot revolution. I am pleased to announce that Landbot AI is officially launched! Are you ready to start building AI chatbots with Landbot AI?