When ChatGPT seemingly burst on the scene late last year, everyone from college students to CEOs took notice. For financial institutions, ChatGPT is just one tool in their digital transformation toolkit. It can refine and complement existing digital banking solutions, but when it comes to improving digital customer service and engagement, it still has a long way to go.
To build out digital customer experience tools, it’s better to go with digital assistants or chatbots that are more refined and specifically trained for banking services. But first, let’s take a look at what everyone’s talking about.
What is ChatGPT?
ChatGPT is a chatbot trained on a massive amount of data. It tracks the context of the conversation and can engage in a back-and-forth exchange, much like speaking with another person. In certain situations, it would be difficult to distinguish ChatGPT's responses from a person's.
The technology is based on large language models (LLMs), specifically GPT-3.5. A large language model is a tool that predictively composes text based on patterns it has learned from massive amounts of text data, usually drawn from publicly available sources such as the internet. GPT, or generative pre-trained transformer, is a framework for large language models based on the Transformer architecture for deep neural networks.
These neural networks can track sequence and relationship data — such as words in a sentence — to learn context and, ultimately, meaning. GPT itself has been making headlines since the release of GPT-3 in June 2020.
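The "predict the next word from learned patterns" idea can be illustrated with a deliberately tiny sketch. The bigram counter below is a hypothetical toy, not how GPT actually works: a real LLM learns from billions of tokens with a deep Transformer rather than simple word-pair counts, but the prediction objective is the same in spirit.

```python
from collections import defaultdict, Counter

def train_bigram(tokens):
    """Count, for each word, which words tend to follow it."""
    model = defaultdict(Counter)
    for current, nxt in zip(tokens, tokens[1:]):
        model[current][nxt] += 1
    return model

def predict_next(model, word):
    """Return the word most often seen after `word` in training."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Toy "training corpus" (made up for illustration).
corpus = "the bank approved the loan and the bank closed the account".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # prints "bank": the most frequent follower
```

The toy model can only parrot pairings it has counted; a Transformer instead attends over the whole preceding sequence, which is what lets it capture context and, ultimately, something resembling meaning.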
Limitations of ChatGPT
ChatGPT is like nothing most of us have seen before. It seems knowledgeable and creative; it can write code and poetry, and even create games. More importantly, it is more likely to align its output with the user's specific goals and much less likely to produce inappropriate or toxic output than previous LLMs.
But for today’s financial institutions, ChatGPT and other similar technologies are still in the early stages of development and, as such, come with a bit of baggage. Here’s why:
- Their knowledge is frozen in time. ChatGPT was trained on data collected up to a fixed point, which means it lacks any information published since then.
- They’re expensive to produce and train.
- As has been reported elsewhere, ChatGPT can produce unethical or offensive responses, which is not exactly great for customer experience.
- They can be wrong. Very wrong. LLMs like ChatGPT are known to “hallucinate,” or produce content that is not based on any reality.
What chatbots should do for digital banking
Current banking customers are comfortable with using a chatbot. According to a Cornerstone Advisors study: “Among consumers whose bank or credit union has deployed a chatbot, 70% have used it at least once, with about three in 10 having used it three or more times.”
The study also showed that consumers' satisfaction with their digital assistant interactions is strong: half were "very" satisfied and 43% "somewhat" satisfied.
Yet for banks looking to build out their digital banking solutions with a chatbot, or its more robust cousin, an intelligent digital assistant (IDA), ChatGPT is just not ready for prime time. It is a general-purpose chatbot that was not built specifically for banking customer service.
It's better to look for chatbots or IDAs that can answer customer and member queries, provide personalized and detailed financial information, help consumers make smarter financial decisions, and serve as the first encounter with your brand.
There are a number of chatbot solutions on the market. But no matter which one a bank chooses, a chatbot or digital assistant can be a forward-facing, bank-savvy digital solution that speaks the unique language of its financial institution.
Smart, personable chatbots can do that. ChatGPT? Not quite yet.
Sasha Caskey is the chief technology officer for Kasisto, which makes conversational AI-powered digital assistants for financial institutions.