With ChatGPT making a huge splash in the news lately, people are asking what this means for Conversational AI.
The short answer is it will not replace Conversational AI, particularly for customer service and contact centre use cases.
When used stand-alone, it cannot meet the basic must-have requirements for enterprise use and, above all, is not even designed for them.
This is not intended to denigrate its capabilities nor deny the giant leap for AI that’s been taken. But with great media hype comes great confusion about what it is and isn’t and what it’s intended to do and not do. So, let’s dive in.
7 Showstoppers for a Pureplay ChatGPT Bot in Customer Service
Conversational AI is a conversational interface used for interacting with software, designed around business needs. So, let’s start with the basics for meeting those needs and where ChatGPT stands.
1. Responses Must Be Consistent and Predictable
Customer service deals with the same issues day in and day out. That means a Conversational AI solution for self-service has to deliver consistent, accurate answers every single time, no matter how many different ways the customer phrases the issue. As ChatGPT’s website states:
“ChatGPT is sensitive to tweaks to the input phrasing or attempting the same prompt multiple times. For example, given one phrasing of a question, the model can claim to not know the answer, but given a slight rephrase, can answer correctly.”
2. It Must Easily Integrate With Multiple Critical Systems
ChatGPT is not connected to the internet, nor designed to be. Moreover, it does not offer handy out-of-the-box integrations with your CCaaS or CRM systems, for example.
Given that integrations and extensibility are basic prerequisites for taking any kind of action, ChatGPT can only serve as entertainment or perhaps write your college essay.
But even if it understands your issue, it can’t help you rebook your flight, return a damaged item or change your mobile phone plan.
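To make that concrete, here is a minimal, purely hypothetical sketch of what an enterprise Conversational AI platform does once an intent has been recognised: it maps the intent to an action and calls a backend system. The endpoint, field names and token below are invented for illustration and do not represent any vendor's actual API.

```python
import requests  # assumed HTTP client; any equivalent works

# Hypothetical CRM endpoint, invented for illustration only.
CRM_BASE_URL = "https://crm.example.com/api"

def handle_intent(intent: str, entities: dict, customer_id: str) -> str:
    """Map a recognised intent to a backend action (illustrative sketch)."""
    if intent == "rebook_flight":
        # Call the (hypothetical) booking system to actually change the flight.
        resp = requests.post(
            f"{CRM_BASE_URL}/bookings/{entities['booking_ref']}/rebook",
            json={"new_date": entities["new_date"], "customer_id": customer_id},
            headers={"Authorization": "Bearer <token>"},
            timeout=10,
        )
        resp.raise_for_status()
        return f"Your flight has been rebooked for {entities['new_date']}."
    # A stand-alone ChatGPT bot stops at generating text: it can phrase a
    # reply, but with no system to call, nothing actually gets done.
    return "I'm sorry, I can't complete that request."
```

The point is not the code itself but the plumbing: without that kind of integration layer, a language model can only describe a solution, not execute it.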
3. Learning Must Be Ongoing and Fast
As ChatGPT’s FAQ notes, it was trained on vast amounts of data with extensive human oversight and supervision along the way.
Moreover, it has limited knowledge of the world after 2021 because of its static data set. Thus, its knowledge and capabilities are frozen in time, hardly useful for a constantly changing business environment or when you update your return policy during the holidays.
4. It Must Be Transparent and Auditable
ChatGPT is a black box. That means you'll be unable to trace a specific answer to its source. A virtual agent that gives all kinds of answers, including different ones for the same issue depending on how it's worded, with no way of explaining why or how it gave that answer, is simply a no-go.
That can become a compliance challenge for industries like healthcare, financial services, insurance, and more.
5. It Needs to Get Stuff Done
As already discussed, ChatGPT can neither continuously learn from constantly changing data nor integrate with other systems. It can deliver highly impressive results on a range of topics, but without Enterprise Conversational AI it can't get anything done for your customers.
6. It Should Be Designed for Your Use Case
In its current form, it is essentially a chatbot for interacting with multiple static and undisclosed information sources.
While ChatGPT is indeed an incredible, if not unparalleled, leap forward in AI technology, it is in no way, shape, or form designed to accomplish the goals or meet the requirements of a specific use case in a contact centre environment.
7. Unclear ROI
ChatGPT is currently not accessible via API, and the cost of a (hypothetical) API call is unclear. There is no publicly available data on what it costs to run, but one estimate puts it in the millions of dollars per month.
While merely a guess, that figure doesn't include the additional resources required, such as content and, in particular, the human trainers who had to review, supervise and rank data as part of its Reinforcement Learning from Human Feedback (RLHF).
It's not as automated as people assume. Even if many of the aforementioned problems are solved, it's unclear whether it could be offered as a service, what it would cost, and what it would take for enterprises to make it work.
As a final note, it's worth pointing out that GPT is just one of multiple Large Language Model (LLM) projects in development, such as BERT, MUM, and PaLM. ChatGPT and its cousin GPT-3 have received the lion's share of media attention but are far from the only ones.
ChatGPT Is Worth Watching & Trying
A Conversational AI solution has to offer near-instant, predictable, repeatable answers (solutions) to users, interact with other systems to carry out tasks, learn from users and new data, and be transparent and auditable to meet compliance and regulatory requirements.
Finally, it’s gotta get stuff done. After all, no one contacts customer service to chat about physics or ask for help with homework – it’s about solving issues quickly.
OpenAI’s CEO said it best: “ChatGPT is incredibly limited but good enough at some things to create a misleading impression of greatness. It’s a mistake to be relying on it for anything important right now. It’s a preview of progress; we have lots of work to do on robustness and truthfulness.”
Author: Guest Author
Published On: 17th Jan 2023 - Last modified: 9th Dec 2024