
Cresta CEO Ping Wu: Many contact centers underestimate the value of LLMs

Ping Wu, CEO of Cresta, has identified three common misconceptions that contact centers hold before implementing large language models (LLMs).

LLMs support generative AI (GenAI) use cases in the contact center, the enterprise, and beyond.

Most people have used an LLM while experimenting with ChatGPT or Gemini, but this first-hand experience has bred widespread misconceptions about how contact centers – and the business as a whole – can leverage the technology.

Wu – who co-founded Google’s Contact Center AI Solution in 2017 – boiled it down to three key misconceptions.

First, people often think that LLMs are simply end-to-end text generation machines. You ask a question and get an answer. But it’s more complex than that.

“There are two parts: understanding and action (decoding),” Wu explained.

In many business contexts, LLMs are more useful for understanding and determining user intent while the output action is guided by business logic.

Second, many customer service managers believe that an LLM is limited to answering questions.

But as Wu notes, “They are also very good at synthesis, extracting key concepts from large amounts of text.” Case summarization is a mature example of this in the contact center.

Finally, business leaders often assume that LLMs share human intuition, but there are important differences to consider. In fact, some tasks that are difficult for humans are easy for LLMs, and vice versa.

Wu gave an example: “LLMs can pass postgraduate biology exams, but simple customer service questions can be answered incorrectly without proper guidance.”

After unpacking these three misconceptions, Wu explained how contact centers can better manage LLM outcomes, shared examples of well-implemented GenAI, and more in an interview with CX Today.

The interview is part of our 2024 CX Trends series and is available below.

For those who want to skim the interview, here are some more highlights.

The two methods for implementing LLMs

According to Wu, there are two common approaches to implementing LLMs in the enterprise.

The first is to optimize the LLM with application-specific data. For example, GitHub Copilot optimizes its model on code repositories to generate better-quality code.

The second approach is retrieval-augmented generation (RAG). Wu explained how it works:

First, relevant business data is searched for and retrieved, then fed into the LLM, which synthesizes the information across multiple documents to answer the question.
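The retrieve-then-generate flow Wu describes can be sketched in a few lines. This is a minimal illustration, not Cresta's implementation: the keyword-overlap retriever stands in for a production vector store, and `build_prompt` shows how retrieved business data is fed to the LLM as grounding context (the knowledge-base entries are hypothetical).

```python
# Toy RAG pipeline: retrieve relevant documents, then ground the LLM prompt
# in them. A real system would use embeddings and an actual LLM call.

def retrieve(query, documents, top_k=2):
    """Score each document by shared query words; keep the best matches."""
    query_words = set(query.lower().split())
    scored = [(len(query_words & set(doc.lower().split())), doc) for doc in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def build_prompt(query, documents):
    """Feed the retrieved business data into the LLM as grounding context."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, documents))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

# Hypothetical knowledge-center entries
knowledge_base = [
    "Refunds are processed within 5 business days.",
    "Our support line is open 9am-5pm on weekdays.",
    "Premium plans include priority routing.",
]

prompt = build_prompt("How long do refunds take?", knowledge_base)
print(prompt)
```

Only the refund document survives retrieval, so the model answers from accurate business data rather than from memory — which is exactly why the knowledge center itself must stay current.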

Companies that follow the RAG approach must ensure that the knowledge centers where these documents are stored contain accurate, up-to-date information.

High-profile failures by New York City and Air Canada in the use of AI in customer experience provide hard lessons here.

Fortunately, companies can use LLMs to enhance the content in knowledge centers.

Wu explained, “AI identifies outdated content and gaps in the knowledge base by analyzing conversations, helping to keep the knowledge base up to date and accurate.”

With this in mind, contact centers can leverage GenAI to enhance knowledge that enables GenAI use cases for customers and agents. It’s a powerful cycle!

But the value of LLMs as knowledge enhancers does not end there. In fact, they can add value in two other ways.

The first option is semantic search. Because LLMs can understand queries and documents at a semantic level, they can significantly improve search quality.

In addition, LLMs can add value by aggregating information from different documents to generate direct answers to customer queries, which is useful for real-time customer support.
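The semantic-search idea above can be sketched briefly: queries and documents are mapped into a shared vector space, and relevance becomes geometric closeness rather than keyword overlap. The tiny hand-written "embeddings" below are hypothetical stand-ins for a real embedding model, which is what actually places synonyms like "refund" and "reimbursement" near each other.

```python
# Sketch of embedding-based semantic search. A real embedding model learns
# these vectors; the 2-d values here are illustrative only.
import math

EMBEDDINGS = {
    "refund": (1.0, 0.1), "reimbursement": (0.9, 0.2),
    "shipping": (0.1, 1.0), "delivery": (0.2, 0.9),
}

def embed(text):
    """Average the vectors of known words (a crude sentence embedding)."""
    vecs = [EMBEDDINGS[w] for w in text.lower().split() if w in EMBEDDINGS]
    if not vecs:
        return (0.0, 0.0)
    return tuple(sum(axis) / len(vecs) for axis in zip(*vecs))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def search(query, documents):
    """Rank documents by semantic closeness to the query."""
    q = embed(query)
    return max(documents, key=lambda d: cosine(q, embed(d)))

docs = ["reimbursement policy for returns", "delivery times by region"]
best = search("refund status", docs)
print(best)
```

Here "refund status" matches the reimbursement document even though the two share no words — the improvement over keyword search that Wu highlights.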

Wu: Future contact centers will be hybrid human-AI systems

“We look forward to transforming contact centers with AI and aim to build an AI-native contact center,” Wu concluded.

Future contact centers will be hybrid human-AI systems in which AI assists human agents, learns from them, and improves over time.

“We expect to see increasing automation of conversations, improved human capabilities, and more sophisticated AI that can handle multimodal tasks and interact with both voice and screens.”

However, the CEO also warned that there will be more spectacular GenAI failures if LLMs are used improperly, such as when a chatbot sells a car for a dollar or promises refunds that don’t exist.

Therefore, it remains crucial to identify suitable use cases and implement guardrails.

CX Trends: Catch up on the entire series

As co-founder of Google’s Contact Center AI Solution and current CEO of customer service disruptor Cresta, Ping Wu is a global thought leader in AI and LLMs.

In recording this video, he joined 14 other subject matter experts (SMEs) as part of the 2024 CX Trends series, each of whom shared their thoughts on the hottest topics in customer experience.

These include proactive and predictive customer support, customer experience design, and conversational intelligence, with speakers from companies such as Google, Lenovo, and Zoho.

Each speaker shared key insights and you can read everything they had to say by checking out our 2024 CX Trends series here!