IBM Watson and Autodesk reinvent customer service

Every year, companies spend $1.3 trillion on 265 billion customer service calls. That’s roughly five bucks a call. On average, it costs $4,000 to find and hire a call center agent (not including salary), plus another $4,800 for training, and with frustrated agents tending to drop like flies in the face of an often brutally stressful job, those costs mount up.

AI, or what IBM calls cognitive computing, is changing that. Autodesk began piloting the IBM Watson Conversation Service in June 2016 as a virtual agent called OTTO, later enhancing it and renaming it AVA (Autodesk Virtual Agent) in February 2017. The return on investment has been tremendous, says Rob High, IBM vice president and Watson chief technology officer and one of the featured speakers at VB Summit coming up on October 23 and 25 in Berkeley, CA.

With AVA, Autodesk has been able to cut time-to-resolution from an average of a day and a half to a matter of minutes, a reduction of roughly 99 percent. An automated case takes only 5 to 10 minutes to resolve, and that’s mostly because a customer can only type so fast.

With their problems resolved more quickly, customers are happier, and satisfaction scores have shot up, the company reports. With AVA and a scripting tool already in place, Autodesk has seen customer service levels rise by 10 points.

But a virtual agent also offloads some of the more stressful and tedious activities that drive agent attrition. Not only are human agents on the front line of assault from often irate and impatient customers, but much of their time is spent answering trivial or mundane questions, often the same ones over and over (and over).

Shuffle these routine questions off to the cognitive computing-powered conversational agent, and the human agent is under less pressure, and gets to tackle the more intellectually challenging questions — the kind that make their job feel more worthwhile.

“It’s helping on all fronts,” High says. “Happier customers, more satisfied workers, and better operational performance.”

Beneath AVA’s hood

Every online customer interaction starts with AVA, which sits on the front end for both web and chat inquiries. These make up about 80 percent of Autodesk’s support contacts. If the virtual agent can’t resolve a situation on its own, it collects enough information to create a case and forward the ticket to the correct human agent.
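
In code, that resolve-or-escalate flow might look something like the sketch below. The function names, confidence threshold, and queue labels are illustrative assumptions, not Autodesk’s actual implementation.

```python
# Sketch of the resolve-or-escalate flow described above. Names, thresholds,
# and queue labels are illustrative assumptions, not Autodesk's implementation.

from dataclasses import dataclass, field
from typing import Callable, Optional, Tuple


@dataclass
class Ticket:
    customer_id: str
    question: str
    context: dict = field(default_factory=dict)   # details gathered during the chat
    assigned_queue: Optional[str] = None


def triage(question: str,
           customer_id: str,
           answer_fn: Callable[[str], Tuple[str, float]],
           route_fn: Callable[[str], str],
           min_confidence: float = 0.8):
    """Answer directly when confident; otherwise build a case for a human agent."""
    answer, confidence = answer_fn(question)
    if confidence >= min_confidence:
        return {"type": "answered", "answer": answer}

    # The virtual agent can't resolve this on its own: capture what it has
    # learned so far and forward the ticket to the right support queue.
    ticket = Ticket(customer_id=customer_id,
                    question=question,
                    context={"draft_answer": answer, "confidence": confidence},
                    assigned_queue=route_fn(question))
    return {"type": "escalated", "ticket": ticket}
```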

AVA is powered by natural language processing (NLP) and deep learning. Autodesk has pumped it full of chat logs, use cases, and forum posts, analyzing more than 14 million sentences for keywords, entities, phrases, and clusters, as well as syntax and idioms, to train the virtual agent to understand the subtleties and nuances of language.

Customers can type in their queries as if they were chatting with a human agent; AVA understands a broad array of customer inquiries, and uses keywords and phrases to quickly unpack the conversation’s context and purpose and return high-confidence answers quickly.
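
As a rough illustration of how keyword and phrase matching can back a “high-confidence answers only” policy, consider the toy classifier below. The intents, phrases, and scoring are made up for clarity; this is not Watson’s NLP pipeline.

```python
# Toy keyword-and-phrase intent matcher, to make the "high-confidence answer"
# idea concrete. Intents, phrases, and scoring are invented for illustration.

from typing import Optional, Tuple

INTENT_PHRASES = {
    "reset_password": ["reset password", "forgot password", "can't log in"],
    "license_transfer": ["transfer license", "move license", "new machine"],
    "download_install": ["download", "install", "setup fails"],
}


def classify(utterance: str) -> Tuple[Optional[str], float]:
    """Score each intent by matched phrases and return a crude confidence."""
    text = utterance.lower()
    scores = {
        intent: sum(phrase in text for phrase in phrases)
        for intent, phrases in INTENT_PHRASES.items()
    }
    best_intent, best_score = max(scores.items(), key=lambda kv: kv[1])
    if best_score == 0:
        return None, 0.0
    return best_intent, best_score / len(INTENT_PHRASES[best_intent])


print(classify("I can't log in and need to reset password"))
# -> ('reset_password', 0.666...) under this toy scoring
```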

Internal subject matter experts are also adding their own knowledge bases to the mix and supervising the ongoing training of the algorithm, fine-tuning the solution’s familiarity with customer vocabularies so that it keeps getting smarter and faster over time.

Beyond the bot

The term “bot” for these kinds of virtual service agents is the common vernacular, but tends to refer to the fairly simple solutions in the marketplace today, which take a single command and resolve it, High says.

“The difference between the bots that we’re most familiar with and conversational agents is getting underneath the thing you’re saying,” High says. “Not just interpreting intent, but navigating into core issues.”

Consumers have a long history of comfort engaging with bots on instant messaging networks, from the rules-based, scripted SmarterChild of the early 2000s to today, when they’re ordering toilet paper in their living rooms through Alexa. In fact, they’re interacting with the technology on a previously unimagined level.

NLP-powered bots can create even better customer experiences. When your agent can understand your customer’s personality or emotional state, it can determine the best path through that conversation, and even how to strategize. Does it give a simple answer or a more involved, elaborate one? Or does it respond piecemeal, letting the user digest each part and following on from there in a logical progression?
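
A minimal sketch of that kind of strategy selection, assuming the agent already has crude signals such as frustration and expertise scores (purely hypothetical inputs, not Watson’s tone analysis):

```python
# Illustrative strategy selection driven by signals like frustration or expertise.
# The signal names, thresholds, and style labels are assumptions, not Watson output.

from typing import List


def choose_response_style(frustration: float, expertise: float) -> str:
    """Map crude customer signals to a delivery strategy."""
    if frustration > 0.7:
        return "short_direct"      # frustrated users want the answer, fast
    if expertise < 0.3:
        return "step_by_step"      # novices digest one piece at a time
    return "detailed"              # comfortable users get the full explanation


def render(answer_parts: List[str], style: str) -> List[str]:
    """Package the same answer differently depending on the chosen strategy."""
    if style == "short_direct":
        return answer_parts[:1]            # just the core answer
    if style == "step_by_step":
        return answer_parts                # one message per step, in order
    return [" ".join(answer_parts)]        # a single, elaborate reply
```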

For instance, when somebody says “What’s my account balance?” a typical bot will look up your account information and deliver your balance. But when someone asks for their balance, there’s usually an underlying intent: they’re getting ready to buy something, or perhaps planning a trip or saving for something larger.

Then there’s context. If someone is asking a customer service question from their smartphone or the speakerphone in their car, they don’t want to get drawn into a long conversation or further interrogation.

There’s also historical data. If they’re a long-standing customer, how many people are in their family, what kind of transactional data do they show, what kind of previous conversations have they had – all that might inform what the conversation looks like, and what the system will do next.

So unlike a single-purpose bot, a conversational agent assumes there is a problem you want solved, and will give you your balance, but will extend beyond that to anticipate and discover what more is needed.
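
Putting those threads together, a hedged sketch of a balance query that layers in channel context and conversation history might look like this; the channel names, account data, and follow-up heuristic are invented for illustration:

```python
# Sketch of a conversational agent that answers the literal question, adapts to
# the channel, and anticipates a likely follow-up from history. Account data,
# channel names, and the follow-up heuristic are illustrative assumptions.

from typing import Dict, List


def answer_balance(account: Dict[str, float], channel: str, history: List[str]) -> List[str]:
    replies = [f"Your balance is ${account['balance']:.2f}."]

    # Context: keep it brief on voice or in-car channels.
    if channel in {"car_speakerphone", "mobile_voice"}:
        return replies

    # History: if recent conversations mention a trip, surface the likely next need.
    if any("trip" in turn.lower() for turn in history):
        replies.append("Planning a trip? I can estimate what to set aside each month.")
    return replies


print(answer_balance({"balance": 1240.55}, "web_chat", ["Asked about a trip to Lisbon"]))
# -> ['Your balance is $1240.55.',
#     'Planning a trip? I can estimate what to set aside each month.']
```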

“It’s the idea of incorporating a discovery process into the conversational system,” he says.

“We think the conversational agent will have the most value when it can delve into what’s behind the question you asked.”

One of the fears that continues to haunt artificial intelligence is that it will eventually replace humans in the workforce – including entire customer service teams.

But cognitive computing, or augmented intelligence, High says, demonstrates the continuing human ability and drive to build tools that help us push forward, from the seed drill and the horse hoe of the agricultural revolution to the leaps in analysis now possible.

Human beings are good at what they do, but there are things we don’t have the mental capacity for, High says. A doctor can’t read the more than half a million articles PubMed issues every month, for example. And too often, experts can be constrained by their own biases into a reinforcement loop, convinced that the way they think about a subject or problem is the right and only way, until they encounter a problem that doesn’t fit the way they think about it.

“Tools have been most durable when what they’re really doing is amplifying human beings, amplifying our strength, amplifying our thinking, amplifying our reach,” High says. “Cognitive computing is powerful because of the combination of the tool and the human together, and that’s where the technology needs to evolve – and where the economic utility lies.”
