Cochlear plugs AI into its global contact centre operations

Cochlear is starting to make use of the AI capabilities in its Amazon Connect contact centre platform, automating its evaluation of agents and calls against a set of quality and compliance objectives.

Cochlear’s Andy McLaughlin at AWS re:Invent.

The Sydney-headquartered maker of implantable hearing technology re-platformed its global contact centre operations in 14 countries to run on Amazon Connect two-and-a-half years ago.

Cochlear’s contact centres support a range of functions, IT networks and telecommunications director Andy McLaughlin told the AWS re:Invent summit in Las Vegas.

“We have business-to-business services, business-to-customer services such as device support, orders and insurance, a ‘pro’ care team which is for clinical support, and sales,” McLaughlin said.

“Customer experience is critical for how we operate. 

“Interestingly, being a company that deals with the hearing impaired, the voice channel is our most popular channel and the way that most of our customers get through to us, and also how we deal with a lot of our business-to-business work as well.”

The project standardised call centre technology and call queue configurations, and improved the reliability and visibility of support services.

Multi-language support made Amazon Connect suitable for Cochlear’s global operations.

The platform also introduced call transcription – a critical underpinning for the company’s automation and AI work now taking place – which is integrated with Cochlear’s Salesforce CRM.

“The dialler is now within the Salesforce CRM,” McLaughlin said.

“Not only that, we have screen pops, so depending on the customer’s calling number, we’re able to immediately pop up that customer’s details, so it allows our agents to get to our customers’ information a lot more quickly.”
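As a rough illustration of how a screen pop of this kind works – not Cochlear's actual Salesforce and Amazon Connect integration – the caller's number is used as a key to look up the customer record before the agent answers. The data below is made up for the example; in production the lookup would hit the CRM.

```python
# Hypothetical "screen pop" sketch: map a caller's number (E.164 format) to a
# customer record so the agent sees the details immediately. The dict here is a
# stand-in for a real CRM query.

customers = {
    "+61255501234": {"name": "Jane Citizen", "account_id": "ACME-001"},
    "+14155550101": {"name": "John Smith", "account_id": "ACME-002"},
}

def screen_pop(calling_number: str) -> dict:
    """Return the customer record for a caller, or a placeholder for unknown numbers."""
    return customers.get(calling_number, {"name": "Unknown caller", "account_id": None})

if __name__ == "__main__":
    print(screen_pop("+61255501234"))  # -> Jane Citizen's record
```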

Since then, and particularly in 2025, Cochlear’s focus has switched to enhancing Connect-powered support services and decreasing its reliance on non-Amazon tools.

“When we rolled out [Connect], we had a workforce management product that was not an Amazon product, we had quality management that was also another product, we had a lot of BI tools, [and] our dashboards used a third [party] product as well,” McLaughlin said.

“We were sending a lot of our data to multiple systems, which our privacy team wasn’t that keen on. So we made the move in June this year to go to Amazon’s unlimited AI.”

Amazon Connect with unlimited AI is much as it sounds: a suite of AI tools that customers can use as much or as little as they want.

Cochlear is already using AI and machine learning to streamline its reporting and its evaluations of agents and calls against set metrics.

In the future, it will also use the platform as part of an adoption of the forecasting, capacity planning and scheduling (FCS) features in Amazon Connect, which will allow Cochlear to transform its contact centre workforce management.

Agent and call evaluations

Cochlear has traditionally been constrained in its ability to evaluate contact centre agents or calls.

“Even when we first moved to Amazon Connect, we were only able to evaluate the calls in which the supervisor would be able to listen – three calls per agent per month, randomly selected, and our quality team would listen to another two,” McLaughlin said.

“So, our agents were only getting around five, what we called scorecards, per month done.”

Automated evaluations, using unlimited AI and transcriptions in Amazon Connect, have materially changed this.

“I looked at what happened in October when I was pulling this [presentation] together and we’d gone from about 1000 evaluations per month to 22,000 now. 

“We’re not running this across all teams – we’re still trying a bunch of different use cases for it – but it really has been a game changer and I can’t stress enough that the reason why we can do that is because of the transcription accuracy.”

The automated evaluations are either rule- and category-based, or generative AI-based.

Rule-based automation is designed to look for a semantic or exact match for specific words or phrases in a call – such as brand terminology or profanity.

“These are really good for compliance checks, such as if you’ve got specific regulatory compliance requirements and explicit policy violations, because of the exact nature of the text that is being searched for there,” McLaughlin said.
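To make the rule-based approach concrete – this is a simplified sketch, not Cochlear's configuration or the Amazon Connect rules engine – a check of this kind boils down to scanning a call transcript for required phrases and prohibited terms. The phrases below are invented for the example.

```python
# Hypothetical rule-based evaluation: scan a call transcript for required
# compliance wording and prohibited terms. Real products offer richer semantic
# matching; this only shows the basic idea.

REQUIRED_PHRASES = ["this call may be recorded"]   # example compliance wording
PROHIBITED_TERMS = ["guaranteed outcome"]          # example policy violation

def evaluate_transcript(transcript: str) -> dict:
    text = transcript.lower()
    missing = [p for p in REQUIRED_PHRASES if p not in text]
    violations = [t for t in PROHIBITED_TERMS if t in text]
    return {
        "compliant": not missing and not violations,
        "missing_phrases": missing,
        "policy_violations": violations,
    }

print(evaluate_transcript(
    "Hi, this call may be recorded for quality purposes. How can I help?"
))
```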

Generative AI automations, meanwhile, can look at whether an agent shows empathy or other soft skills, and use reasoning to determine quality scores.

“It does provide an AI reasoning and some words [as to] why it’s come up with that particular score as well. These are great for quality assessments and soft skill evaluations,” McLaughlin said.
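A generative evaluation like the one McLaughlin describes is typically structured as a prompt that asks the model for both a score and the reasoning behind it. The sketch below assumes a hypothetical call_model() helper standing in for whichever hosted model endpoint is used; it is not the Amazon Connect implementation.

```python
import json

# Hypothetical generative evaluation: ask an LLM to score a transcript for
# empathy and explain its reasoning. call_model() is a placeholder for a real
# model invocation (for example via a hosted model service); it is not a real API.

def build_prompt(transcript: str) -> str:
    return (
        "Score the agent's empathy in this call from 1-5 and explain why.\n"
        'Respond as JSON: {"score": <int>, "reasoning": "<text>"}\n\n'
        f"Transcript:\n{transcript}"
    )

def call_model(prompt: str) -> str:
    # Placeholder response; in practice this would call the model endpoint.
    return '{"score": 4, "reasoning": "The agent acknowledged the caller\'s frustration."}'

def evaluate_soft_skills(transcript: str) -> dict:
    raw = call_model(build_prompt(transcript))
    return json.loads(raw)  # {"score": ..., "reasoning": ...}

print(evaluate_soft_skills("Caller: I'm frustrated... Agent: I'm sorry to hear that..."))
```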

“Really, the benefit of all this is that because you can score every agent and all their calls, you can now target the actual issues or problems, such as where there’s a common question that might not be being answered correctly by an agent.

“You can target those by looking at the reporting and then use that to have much more targeted training for your agents as well. 

“So it’s a very good thing to have so much more automation and be able to score so much more [contact centre activity].”

Workforce management

Workforce management using FCS – forecasting, capacity planning and scheduling – is now a key focus in the contact centre.

McLaughlin said that as Cochlear had two years of historic data already in Amazon Connect, it was well-prepared to begin using machine learning in the workforce management space.

This is about accurately forecasting call volumes, ensuring the right number of contact centre agents are rostered to work at the right times, and that break management and agent schedule adherence metrics are met.
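To show the shape of the forecasting problem, here is a deliberately simple seasonal-average sketch – not the machine learning behind Amazon Connect's FCS features – that projects next week's call volumes per weekday from historical counts. The numbers are made up.

```python
from collections import defaultdict
from statistics import mean

# Toy forecasting sketch: average historical call counts by weekday to project
# next week's volumes. Real FCS forecasting uses ML models; this only shows the
# input/output shape (history in, per-interval forecast out).

history = [  # (weekday, calls) pairs; 0 = Monday
    (0, 320), (1, 298), (2, 310), (3, 305), (4, 280),
    (0, 335), (1, 301), (2, 322), (3, 299), (4, 275),
]

by_weekday = defaultdict(list)
for weekday, calls in history:
    by_weekday[weekday].append(calls)

forecast = {weekday: round(mean(counts)) for weekday, counts in sorted(by_weekday.items())}
print(forecast)  # e.g. {0: 328, 1: 300, 2: 316, 3: 302, 4: 278}
```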

AI agents answering calls

Further into the future, Cochlear is interested in exploring whether AI agents might be able to answer voice calls.

“We started playing with interactive voice response using an AI agent to answer calls,” McLaughlin said.

“We plugged in a bunch of knowledge base articles – just threw them into an S3 bucket, pointed Amazon Connect at that S3 bucket for its knowledge base and we had a play with this in our IT help desk.”
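A minimal sketch of the "threw them into an S3 bucket" step, assuming a hypothetical bucket name and local folder, is below; it uses the standard boto3 upload call, while pointing Amazon Connect at the bucket as a knowledge base is configured separately.

```python
import pathlib
import boto3

# Hypothetical sketch: upload local knowledge base articles to an S3 bucket so a
# contact centre AI agent can be pointed at them. The bucket name and folder are
# placeholders; AWS credentials are assumed to be configured in the environment.

BUCKET = "example-helpdesk-knowledge-base"   # placeholder bucket name
s3 = boto3.client("s3")

for article in pathlib.Path("kb_articles").glob("*.html"):
    s3.upload_file(str(article), BUCKET, f"articles/{article.name}")
    print(f"Uploaded {article.name} to s3://{BUCKET}/articles/{article.name}")
```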

McLaughlin said that the AI agent’s responses were “pretty decent”, but that productionising the use case would require additional work.

“We had all of this working in a pilot that we ran for our helpdesk a month or two ago, but we have some challenges with our ITSM tool,” he said.

“That’s why we haven’t implemented that fully, but it is something that we’ve [spent] a lot of time [on] to really get to understand how AI agents work.”


