IPsoft has used natural language understanding technology in its Amelia digital worker so that users can have a normal conversation with the bot
Amelia is an AI-based “digital worker” created by US software company IPsoft that can be hired in a similar way to human employees to carry out key repetitive tasks. Dan Robinson learns more about the bot its maker claims is the most sophisticated digital assistant in the world
From checking for train delays to playing music, virtual assistants like Amazon Alexa and Google Home are making our lives easier every day by enabling us to make simple voice commands.
But anyone who has used these popular devices can also attest that some instructions get lost in translation.
This is one of the overarching barriers to artificial intelligence reaching its full potential, and one that Amelia – dubbed the “world’s most sophisticated virtual assistant” – aims to break down for businesses rather than consumers.
While the natural language processing (NLP) technology behind Alexa and its contemporaries can understand key words and phrases, this doesn’t always enable the machine to understand humans when they communicate more naturally.
Instead, Amelia’s developer IPsoft says more sophisticated natural language understanding (NLU) is required so an AI bot can identify the different contexts and real meanings behind words by learning to understand the world around it.
Chetan Dube, CEO of the US-headquartered software firm, cites Gartner research that estimates businesses achieve only a 6% return on AI investment but claims Amelia’s clients can exceed 35% by integrating automation systems.
“The problem with AI is business users are sending their requests in natural language,” he says at the company’s Digital Workforce Summit in New York City.
“All your systems that are operating all take structured input – they all expect a certain payload, get fed certain parameters and will provide results.
“The problem is the human business user is talking in unstructured form through natural language, so there’s a big chasm we need to cross.
“Only by the fusion of your cognitive automation spine can you make an effective end-to-end user experience and achieve the promise of AI.
“Amelia brings together deep neural networks – the connections within data – and logic so it can have a deeper contextual understanding of what’s being said.
“It’s about saying ‘I’ll have a Starbucks if it’s raining’ and having the flexibility to switch between contexts, which is easy in our brains but not so much in machines’ brains – while adding the emotional connection that makes a customer experience rich.”
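The context switching Mr Dube describes – “I’ll have a Starbucks if it’s raining” – can be sketched as a toy handler. This is purely illustrative (the function names and the rain-check logic are hypothetical, not IPsoft’s implementation), but it shows why a conditional request needs knowledge of the world, not just keyword spotting:

```python
# Illustrative sketch only: handling a conditional request such as
# "I'll have a Starbucks if it's raining". All names and logic here are
# hypothetical, not IPsoft's actual technology.
import re

def parse_conditional(utterance: str):
    """Split a request of the form '<action> if <condition>' into parts."""
    match = re.match(r"(?i)(.+?)\s+if\s+(.+)", utterance)
    if not match:
        return utterance.strip(), None  # an unconditional request
    return match.group(1).strip(), match.group(2).strip()

def resolve(utterance: str, world_state: dict) -> str:
    """Act on the request only if its stated condition holds in the world."""
    action, condition = parse_conditional(utterance)
    if condition is None:
        return f"Doing: {action}"
    # A keyword-only system stops at recognising the words; contextual
    # understanding means checking the condition against the world.
    if condition == "it's raining" and world_state.get("raining"):
        return f"Doing: {action}"
    return f"Skipping: {action} (condition not met)"

print(resolve("I'll have a Starbucks if it's raining", {"raining": True}))
print(resolve("I'll have a Starbucks if it's raining", {"raining": False}))
```

Even this toy shows the asymmetry Dube points to: the hard part is not parsing the sentence but maintaining a model of the world (“is it raining?”) that the condition can be checked against.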
What is Amelia? How IPsoft has created a digital worker available for hire
Amelia has the ability to hold natural language conversations across chat, voice or social media platforms and connect the information to data that will help provide a personalised service.
It can then integrate into a company’s existing IT systems to execute processes end-to-end, monitoring how everything works and learning best practices by observing top employees – before adapting accordingly through machine learning.
Vodafone is one of the companies using Amelia and achieved an 81% reduction in mean time to repair (MTTR) – the average time taken to resolve a customer problem – while other clients using it to improve efficiency include investment managers BlackRock and Franklin Templeton, education company McGraw-Hill and the Bank of Montreal.
It has also resolved more than 85% of queries addressed to the bot at Sweden’s SEB Bank and covered 82% of IT service desk requests across 80,000 conversations at a global telecoms provider.
Patrick Marlow, director of engineering at IPsoft, says: “If an issue gets escalated beyond Amelia, she never leaves the conversation and is learning from every interaction.
“She’s looking at the things an agent will say to the customer and is dynamically grafting on new pieces of a business process that can be sent to an administrator afterwards to say ‘should I ask these things next time I have an interaction?’
“She can use this self-learning to better her business processes and, ultimately, we can retrain our human agents if they haven’t followed this process.”
While Amelia is becoming established as a digital worker for a number of large companies, IPsoft now wants to make it more accessible.
To that end, it has created the 1Store, which lists 672 skilled digital employees that can be “hired” by companies to perform tasks in industries ranging from banking, insurance and retail to healthcare, telecoms and media.
Businesses can go through the same hiring process they would with a human, including looking at previous references from other firms that have analysed how well bots have worked, as well as “interviewing” them to see how they would perform certain tasks.
They can choose the best tool for their requirements by answering a series of questions relating to aspects such as the industry they are involved in, the challenges they want to solve and the channels and domains in which they wish to operate.
AI challenges IPsoft is trying to solve through Amelia
Mr Dube highlights a number of reasons for the 6% return on AI investment estimated by Gartner, chief among them the lack of an end-to-end AI ecosystem within businesses.
“You can have just about every automation tool that exists under the sun – machine learning systems, ticketing or chatbots – but there’s no automation backbone that links all these systems together today in a cohesive fashion.
“How would a human function if they didn’t have a spine that connects the brain with all our cognitive movements?
“In the enterprise, they don’t have something that glues it all together to make it efficient from end-to-end.”
Another problem companies face is the execution delay that exists between front office intermediaries and ticketing systems – a barrier the likes of Uber have broken down via an on-demand model that directly connects a customer request with the end product.
Difference between IPsoft’s Amelia and Alexa
Mr Dube adds that customers are “difficult” because serving them requires real cognitive capabilities, such as understanding natural speech when they communicate with a business.
While this is something digital assistants like Amazon’s Alexa and Google Home aim to provide, the communication can be rigid as the technology doesn’t yet fully understand natural speech.
“Customers want someone picking up the phone and understanding them,” says Mr Dube.
“With Alexa, you can turn the lights on, choose a song and order a cab, but these happy case scenarios account for just 14% in the real industrial world.
“No one is calling in and saying ‘I’d like a new auto insurance policy because I’m moving from Ohio to New Jersey with my wife but she has a different discount code with her company. Can we merge our discount codes to give us a better rate?’
“That’s real. You can’t fake it by putting those words in a bucket and coming up with a solution – you really need a brain to clone human intelligence.”
Greater sophistication is required in business automation systems so the machine fully understands the context of a human request.
Director of engineering Mr Marlow says: “When you think of the virtual assistants consumers use, a lot of them are single turn-based – turn the air conditioning on, look for the nearest Starbucks, is my train on time?
“The likes of Alexa and Google Home are great for the answers to the frequently asked questions and give the impression of natural language processing but there’s a lot of syntax involved.
“I can say ‘set the downstairs temperature to 72F’ because I’ve been trained to say it in that particular order.
“But humans aren’t linear in communication methods. In the real world, I’d be saying ‘this is really hot in here, I’m going to have to take my jacket off’.
“So to polish that out, you need a more sophisticated natural language processing system.
“A full-blown difference in Amelia is multi-tiered. It’s not just natural language processing, it’s multi-entity extraction using entities across domains.
“It’s allowing you to micro-tune the parameters to get the exact transaction.
“There’s a lot of things that make her a lot different to just being a digital assistant.”
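The multi-entity extraction Mr Marlow contrasts with single-turn assistants can be illustrated with a toy version run against the insurance request quoted earlier. The patterns and entity names below are hypothetical sketches, not IPsoft’s technology, but they show the idea of pulling several linked entities from one free-form sentence:

```python
# Illustrative sketch only: toy multi-entity extraction from a single
# utterance, in the spirit of the auto-insurance example above.
# The entity names and regex patterns are hypothetical.
import re

PATTERNS = {
    "policy_type": r"\b(auto|home|life) insurance\b",
    "move_from":   r"moving from (\w[\w ]*?) to ",
    "move_to":     r"moving from [\w ]+? to ([A-Z][\w ]*?)(?: with|\.|,|$)",
}

def extract_entities(utterance: str) -> dict:
    """Pull every matching entity out of one free-form sentence."""
    found = {}
    for name, pattern in PATTERNS.items():
        match = re.search(pattern, utterance)
        if match:
            found[name] = match.group(1)
    return found

request = ("I'd like a new auto insurance policy because I'm moving "
           "from Ohio to New Jersey with my wife")
print(extract_entities(request))
```

A single-turn assistant would need the user to supply each of these values in a fixed order; extracting them together from one sentence is the kind of flexibility Marlow is describing, though a real system would use trained models rather than hand-written patterns.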
The AI technology IPsoft has used in developing Amelia
IPsoft’s main goal when developing Amelia has been to create “human level understanding”, according to Christopher Manning, a Stanford University professor of computer science and linguistics who helped to build Amelia’s DNA.
“The essential advantage of human beings is they have a lot of contextual knowledge and flexibility in how they go about doing things,” he says.
“We have cognitive processes where we reason and plan, where we understand the consequences of taking a certain course of action.
“So we’re taking this one set of reasoning and we’re converging the more symbolic parts of AI with the more statistical machine learning parts.
“We will only reach the ultimate power of AI once we stand on both sides of that continuum.”
Prof Manning says conversations between humans and machines are heavily scripted at the moment, where the computer is able to recognise certain “intents” such as a request to cancel a policy or change address, but it can be too easy for these to be wrongly interpreted.
He adds: “That’s not how human conversations work – we are very good at balancing things and switching between contexts, and that’s a versatility and richness we want in Amelia.
“A lot of the challenges that still exist aren’t purely linguistic challenges, but challenges of understanding the world, because we need to understand how people’s lives work.
“The world is a very complex place and having that high level of complex understanding is still a long way away.”
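The scripted intent recognition Prof Manning describes – and Mr Dube’s warning about “putting those words in a bucket” – can be sketched as a toy keyword-bucket classifier. The intent names and keywords are hypothetical, but the example shows how easily such a system misreads a request:

```python
# Illustrative sketch only: a keyword-bucket intent matcher of the kind
# Prof Manning says is too easily wrong. Intents and keywords are hypothetical.
INTENT_KEYWORDS = {
    "cancel_policy":  {"cancel", "policy"},
    "change_address": {"change", "address", "moving"},
}

def classify(utterance: str) -> str:
    """Pick the intent whose keyword bucket overlaps the utterance most."""
    words = set(utterance.lower().replace(",", "").split())
    scores = {intent: len(words & kws) for intent, kws in INTENT_KEYWORDS.items()}
    return max(scores, key=scores.get)

# A clean request is classified correctly...
print(classify("Please cancel my policy"))                 # cancel_policy
# ...but negation is invisible to a word bucket: the user explicitly
# said NOT to cancel, yet 'cancel' and 'policy' still win.
print(classify("Whatever you do, don't cancel my policy"))  # cancel_policy
```

The second call is the failure mode both men point to: the words were recognised, but the meaning – which depends on context the bucket cannot see – was not.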