Enterprises increasingly want to know not just what customers are buying, but how they feel, and vendors such as Amazon Web Services and Qualtrics, SAP’s experience management subsidiary, are making moves to capitalize on that desire.
AWS has expanded its voice transcription service with new APIs that attempt to report not just what someone said, but what they meant and, using sentiment analysis, how they felt about it, too.
A week earlier, Qualtrics agreed to buy conversational analytics specialist Clarabridge for $1.1 billion in a play to automate how Qualtrics determines how customers and employees feel. The Clarabridge acquisition fills a gap in Qualtrics’ capabilities that Forrester analyst Faith Brown identified back in 2018, when SAP bought the online survey company for $8 billion. “Clarabridge, when it comes to text analytics, is better than Qualtrics,” she said back then.
Whereas Qualtrics asks employees or customers explicitly, in surveys, how they feel about a particular issue, Clarabridge attempts to glean this implicitly from what they say in calls or write on social media or in emails, using sentiment analysis.
Sometimes referred to as opinion mining, sentiment analysis is the computational process of identifying someone’s attitude towards something based on what they say or write about it. At its most basic — as in AWS’s new service — it involves scoring the language used along a single axis, from negative to positive. Businesses have been doing that for over a decade to find out what employees really think.
More sophisticated techniques seek to classify the specific emotions expressed, such as anger or disgust. Clarabridge and other companies like it package that functionality into applications that deliver reports contact center workers or their supervisors can use to improve their interactions or training.
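The basic single-axis approach can be sketched with a toy lexicon-based scorer. The word list and weights below are purely illustrative and bear no relation to the models AWS or Clarabridge actually use; real systems rely on far larger lexicons or trained language models.

```python
# Toy lexicon-based sentiment scorer: sums per-word weights and clamps
# the result to a -5..+5 axis, the kind of single negative-to-positive
# scale described above. Lexicon and weights are illustrative only.
LEXICON = {
    "great": 2, "love": 3, "helpful": 2, "thanks": 1,
    "slow": -2, "broken": -3, "refund": -1, "terrible": -4,
}

def sentiment_score(text: str) -> int:
    words = (w.strip(".,!?") for w in text.lower().split())
    raw = sum(LEXICON.get(w, 0) for w in words)
    return max(-5, min(5, raw))  # clamp to the -5..+5 axis

print(sentiment_score("The agent was great and very helpful!"))       # → 4
print(sentiment_score("Still broken and terrible, I want a refund"))  # → -5
```

The clamping step shows why single-axis scoring is crude: a furious complaint and a mildly annoyed one can land on the same value, which is exactly the gap the emotion-classification techniques above try to close.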
Feedback on feelings
Amazon Transcribe Call Analytics offers an API and command-line tool that enterprises can build into their applications to automate analysis of customer contact center interactions in 21 languages. The basic transcription service has been around since 2017; the new API, unveiled on Aug. 4, 2021, enriches the transcription with how long each participant spoke, tags the words the caller used to describe their intention and the various phases of the call (intro, closing), and scores the variation in sentiment (graded from -5 to +5 for each speaker), speaking speed, and loudness at various stages of the call.
The service won’t work on live calls: it requires that audio recordings be uploaded to the AWS Simple Storage Service (S3). It’s already available in North America (US West, US East, and Canada), Europe (London and Frankfurt), and Asia-Pacific (Mumbai, Seoul, Singapore, Sydney, and Tokyo).
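In practice, the upload-then-analyze workflow means submitting a job that points at a recording already in S3. The sketch below builds the request parameters only; the bucket, role ARN, and job name are placeholders, and while the field names follow AWS's StartCallAnalyticsJob API as documented at launch, they should be verified against the current boto3 documentation before use.

```python
# Sketch of the parameters for a post-call analytics job on a recording
# already uploaded to S3. All identifiers below are placeholders; field
# names are per AWS's StartCallAnalyticsJob API (verify against current
# boto3 docs).
params = {
    "CallAnalyticsJobName": "support-call-0042",  # placeholder job name
    "Media": {"MediaFileUri": "s3://example-bucket/calls/0042.wav"},
    "DataAccessRoleArn": "arn:aws:iam::123456789012:role/TranscribeRole",
    # Stereo recordings let the service attribute speech per speaker:
    "ChannelDefinitions": [
        {"ChannelId": 0, "ParticipantRole": "AGENT"},
        {"ChannelId": 1, "ParticipantRole": "CUSTOMER"},
    ],
}

# With boto3 installed and AWS credentials configured, the job would be
# submitted with:
#   boto3.client("transcribe").start_call_analytics_job(**params)
print(sorted(params))
```

Because the job runs asynchronously on the uploaded file, results arrive minutes after the call ends rather than during it, which is the live-call limitation noted above.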
Pricing for the new service is based on call duration and starts at around $0.03 per minute in the US East region, dropping by almost two-thirds if call volume exceeds 5 million minutes per month.
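A rough two-tier cost model based only on the figures above can make the volume discount concrete. The exact tier boundaries and the discounted rate here are simplifying assumptions (AWS's real pricing has more tiers and varies by region); check the published pricing tables before budgeting.

```python
# Illustrative two-tier cost model from the article's figures: roughly
# $0.03/min at low volume, falling by almost two-thirds past 5 million
# minutes per month. Rates and threshold are simplifications, not
# AWS's actual pricing table.
BASE_RATE = 0.03        # USD per minute, low-volume tier (article figure)
DISCOUNT_RATE = 0.011   # assumed rate ~63% lower above the threshold
THRESHOLD = 5_000_000   # minutes per month

def monthly_cost(minutes: int) -> float:
    if minutes <= THRESHOLD:
        return minutes * BASE_RATE
    # Minutes past the threshold are billed at the discounted rate.
    return THRESHOLD * BASE_RATE + (minutes - THRESHOLD) * DISCOUNT_RATE

print(f"${monthly_cost(100_000):,.2f}")    # 100k minutes/month
print(f"${monthly_cost(8_000_000):,.2f}")  # 8M minutes/month
```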
Both Microsoft Azure and Google Cloud Platform offer textual sentiment analysis and speech-to-text tools, but neither appears to have packaged them in a single API for incorporation into enterprise dashboards in the way AWS has.
Sentiment analysis is cropping up in other places, too. Qualtrics’ rival SurveyMonkey, which recently changed its name to Momentive, added analysis of open-text survey responses in 10 languages to its paid plans back in April 2021.
All these approaches, from simple scoring to the tagging of emotions, rely on natural language processing, a field that seeks to teach computers about grammar and the meaning of words, or at least to identify patterns in text and associate numerical values with them.
The field is fraught with pitfalls for the unwary algorithm: sarcasm is an obvious one (“yeah, right”), but there are also people who dodge the issue to avoid conflict, such as the fictional curate who described a bad egg he was served as “good in parts.”