Quality Management for the Modern, Digital Contact Centre


In this contributed article, Brad Snedeker takes a close look at quality management and what the future holds.

Good quality management (QM) programmes encourage positive behaviour, provide continuous motivation and align contact centre conversations with a vision of excellent customer experience.

Yet, many programmes are growing stale. When this happens, evaluation fails to perfect performance – instead, performance plateaus. Good advisors stay good and mediocre advisors remain mediocre.

If this sounds a little too familiar, read on for highlights from the Quality Management track at Calabrio’s Customer Connect (C3) conference earlier this month, where we discussed best practice, QM theory and innovations that will reinvigorate the modern, digital contact centre. For more detail, sessions from the QM track, “The Performance Connection”, are now available on demand.

The Set and Forget Calamity

Companies are constantly changing. Corporate goals shift, customer expectations evolve and products/services advance. Quality evaluation criteria may, therefore, no longer be relevant.

For this reason, continuous refinement of QM processes is essential, with regular reviews and updates of evaluation forms. The more often a contact centre does this, the greater its visibility of customer experience delivery.

Yet, many quality analysts inherit old scorecards – built on assumptions of what customers want – and will continue to use them across multiple channels beyond their “sell-by” date. Such a practice is unlikely to drive significant, sustained performance improvements.

It’s a Balancing Act

Individual scorecards for each channel, which align with ideal customer and business outcomes, will propel a quality programme to the next level. Getting to grips with these outcomes is, however, where the challenge begins. Here are three tips to overcome the challenge:

  1. Focus on customer outcomes. What drives customer satisfaction in each channel? Customer research, feedback and tools such as sentiment analysis will illuminate best practices to build into quality forms.
  2. Consider company outcomes. Which practices ensure compliance? Strategic goals, policy and script adherence requirements are also critical considerations. Add these into the forms too.
  3. Finally, combine the criteria and weight each performance standard. Not every standard carries equal value. Some are essential, others only recommended.

Of course, if every standard has a different value, scoring may take more time. Yet, a QM system will automate the process, remove human error and enable contact centres to add more scoring options.

For example, alternative scoring options include:

  • Zero-value questions for when a standard does not apply to the contact reason
  • Scaled answers for giving advisors partial credit
  • Key Performance Indicator (KPI) questions for separate evaluation

The final capability on this list enables contact centres to track individual questions, or groups of questions, as separate KPIs. Tracking a KPI such as compliance in this way can prove beneficial. Lastly, add a comments field to evaluation forms for feedback. Quality scores will then have greater context, bolstering the review process.
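To make the weighting and scoring options above concrete, here is a minimal sketch in Python of how a scorecard might combine weighted standards, zero-value questions, scaled answers and a stand-alone KPI question. The standards, weights and field names are illustrative assumptions, not Calabrio’s implementation.

```python
# A minimal sketch of weighted scorecard arithmetic. Standards, weights
# and answer values are illustrative assumptions, not Calabrio's design.

def score_evaluation(answers, standards):
    """Compute a weighted quality score, honouring zero-value (N/A)
    questions and scaled (partial-credit) answers."""
    earned = possible = 0.0
    for question, weight in standards.items():
        answer = answers.get(question)
        if answer is None:          # zero-value: the standard does not
            continue                # apply to this contact reason
        earned += weight * answer   # answer in 0.0-1.0 allows partial credit
        possible += weight
    return round(100 * earned / possible, 1) if possible else None

# Essential standards carry more weight than recommended ones.
standards = {
    "greeting": 1,
    "verified_identity": 3,
    "compliance_statement": 5,
    "resolved_issue": 4,
    "offered_survey": 1,
}

answers = {
    "greeting": 1.0,            # full credit
    "verified_identity": 0.5,   # scaled answer: partial credit
    "compliance_statement": 1.0,
    "resolved_issue": 1.0,      # "offered_survey" omitted: zero-value / N/A
}

print(score_evaluation(answers, standards))  # -> 88.5

# A KPI question tracked separately, e.g. a compliance KPI:
compliance = {"compliance_statement": standards["compliance_statement"]}
print(score_evaluation(answers, compliance))  # -> 100.0
```

Note how the omitted question drops out of both the numerator and the denominator, so an advisor is never penalised for a standard that did not apply to the contact.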

Managing Evaluation Forms

Following a process such as the one above is critical. As my colleague Mark Fargus, a Contact Centre Analytics Consultant at Calabrio, put it: “Evaluation forms are the key component, the heartbeat of your quality programme.”

So, how can contact centres continue to maximise the value of their evaluation forms? During his presentation at the recent Calabrio Customer Connect event, Mark shared some excellent advice on better managing these forms and adding value to the entire quality programme. Here is a taster of these powerful practices, all available within most modern QM systems:

  • Evaluation Responses – quality analysts can fill in a small tick-box before sending completed evaluations to advisors. By doing so, they filter each evaluation form based on whether it needs supervisor approval or requires advisor acknowledgement. Analysts can even choose to give advisors the right to appeal their score, ensuring a fairer QM culture.
  • Ad Hoc Tasks – bookmarking calls for review, evaluation and calibration simplifies the task of sending and assigning contacts for fellow analysts. Doing this, instead of sending contacts via email, keeps contacts secure and provides a succinct record of review.
  • Flags and Tags – good calls are extremely helpful when training advisors. If tagged by quality analysts, coaches can access these calls and share them with teams as a great example of what “good” looks like. Flagging calls for HR is also helpful.
  • Automated Call Selection – infused with speech analytics technology, the best QM systems can automatically select calls for manual scoring based on business objectives. This increases call selection efficiency and reduces “cherry-picking”.
  • Set-Up Notifications – by desktop notification or email, this function alerts analysts when they meet goals, receive an evaluation appeal request or accrue an evaluation that requires approval. Advisors also receive alerts when their calls are evaluated, re-engaging them with the quality process.
  • Customer Metadata – metadata describes call characteristics and, through APIs, can be passed into the ACD. Contact centres can gather more of it through the analytics technology within the QM system. For example, analytics may identify a customer’s first language and pass that information to the ACD – an insight that may improve future outbound campaigns (see the sketch after this list).
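As a rough illustration of that last point, the sketch below passes an analytics-derived language preference to an ACD over HTTP. The endpoint, payload shape and authentication are hypothetical; a real integration would follow the ACD vendor’s API documentation.

```python
# A hedged sketch of pushing analytics-derived metadata to an ACD over
# HTTP. The endpoint, payload shape and token are hypothetical.
import requests

ACD_API_URL = "https://acd.example.com/api/contacts"  # hypothetical endpoint

def push_customer_language(contact_id: str, language: str) -> None:
    """Attach the customer's detected first language to the contact
    record so future (e.g. outbound) routing can use it."""
    response = requests.patch(
        f"{ACD_API_URL}/{contact_id}",
        json={"metadata": {"preferred_language": language}},
        headers={"Authorization": "Bearer <token>"},  # placeholder credential
        timeout=10,
    )
    response.raise_for_status()

# e.g. analytics flags a Spanish-language caller:
# push_customer_language("contact-1234", "es")
```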

Each of these features brings quality programmes into the new age. However, predictive modelling will push QM even further into the future.

The Predictive Model

Predictive analytics, embedded into QM software, allows contact centres to analyse 100% of interactions – regardless of channel. To enable this, the technology fills in evaluation forms automatically.

Of course, some standards within the evaluation form require human judgement, yet the software automates the rest. For example, it’s possible to automate the following criteria:

  • Did the advisor read the compliance statement?
  • Did the advisor ask for an account number?
  • Did the advisor thank the customer for calling?

Identify standards like these that are compatible with analytics, separate them from the rest of the scorecard and program the criteria into the QM system, as in the sketch below. At the end of this process, contact centres can score every customer conversation and automatically calculate “spin-off” metrics. A compliance KPI is again an excellent example.
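As a simplified illustration of the idea, this sketch auto-fills the three criteria above by searching a call transcript for expected phrases. The phrase lists are assumptions, and production speech analytics is far more sophisticated than substring matching.

```python
# A minimal sketch of auto-scoring analytics-friendly standards by
# searching a transcript for expected phrases. Phrase lists and field
# names are illustrative assumptions.
import re

AUTO_CRITERIA = {
    "read_compliance_statement": [r"this call may be recorded"],
    "asked_for_account_number":  [r"account number", r"account id"],
    "thanked_customer":          [r"thank you for calling", r"thanks for calling"],
}

def auto_score(transcript: str) -> dict:
    """Pre-fill the automatable part of an evaluation form; the
    judgement-based standards are left for a human analyst."""
    text = transcript.lower()
    return {criterion: any(re.search(phrase, text) for phrase in phrases)
            for criterion, phrases in AUTO_CRITERIA.items()}

transcript = ("Thank you for calling. This call may be recorded for "
              "quality purposes. Could I take your account number?")
print(auto_score(transcript))
# {'read_compliance_statement': True, 'asked_for_account_number': True,
#  'thanked_customer': True}
```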

Just avoid viewing automated QM as a silver bullet. Manual call scoring is still necessary. However, automation completes a large chunk of an analyst’s job, allowing them to assess more calls and spend more time sharing performance insights with advisors.

Finally, automated scoring may also enhance manual scoring by indicating which calls will provide more coaching opportunities.

Visualising Results

To obtain more value from automated and manual scoring, create a simple dashboard that analysts, team leaders and advisors can use to spot trends at a glance. Great QM systems have these dashboards built in, and the quality team can customise them to present the insights of most value to advisors.

By collating these insights, QM systems create a complete view of advisor effectiveness. Within the Calabrio system, a feature named “Data Explorer” makes all this possible.

Mark Fargus explains: “Data Explorer is module agnostic. So, you can take data elements from different components of the system – such as quality scores and data from analytics. The latter enables advisors to access a word cloud of frequent phrase ‘hits’.”

“Also, by integrating the QM system with other tools, Data Explorer can access other data, such as WFM data. Advisors may then also access their schedule adherence rates, handling times and other metric results from the same portal.”

Analysts can also create their own dashboards, enabling them to spot individual and team trends. Such trends include:

  • Who are currently the best performers?
  • Who is improving most quickly?
  • Who needs more support, and in which particular areas?

These trends can then be collated in automated reports and distributed across the contact centre through recurrence features.
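For illustration, the sketch below answers those three trend questions from a simple table of weekly scores using pandas. The column names and sample data are assumptions, not a Calabrio schema.

```python
# A sketch of spotting advisor trends from raw weekly quality scores.
# Sample data and column names are illustrative assumptions.
import pandas as pd

scores = pd.DataFrame({
    "advisor": ["Ana", "Ana", "Ben", "Ben", "Cal", "Cal"],
    "week":    [1, 2, 1, 2, 1, 2],
    "score":   [82, 85, 91, 92, 70, 79],
})

# One row per advisor, one column per week.
by_advisor = scores.pivot(index="advisor", columns="week", values="score")

print(by_advisor[2].idxmax())                     # best performer now: Ben
print((by_advisor[2] - by_advisor[1]).idxmax())   # fastest improver: Cal
print(by_advisor[2].idxmin())                     # needs most support: Cal
```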

Extra Tip


Through integrations, bolster dashboards and reports with customer experience metric data.

By correlating such data with quality scores, analysts can test their evaluation forms, checking that they effectively measure customer experience.
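As a minimal illustration, the sketch below correlates quality scores with post-contact CSAT ratings for the same interactions; a weak correlation would flag the form for review. The sample data is invented, and `statistics.correlation` requires Python 3.10+.

```python
# A minimal sketch of testing an evaluation form against a customer
# experience metric. Paired sample values are invented for illustration.
from statistics import correlation  # Python 3.10+

quality_scores = [88, 92, 75, 80, 95, 70, 85]   # analyst evaluations
csat_ratings   = [4.5, 4.8, 3.2, 3.9, 4.9, 3.0, 4.2]  # matching CSAT

r = correlation(quality_scores, csat_ratings)   # Pearson r
print(f"Pearson r = {r:.2f}")
# A high r suggests the form measures what customers actually value;
# a weak r means it is due for the review-and-refine cycle above.
```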

Thanks to Brad Snedeker, Product Manager for Calabrio Quality Management, for this contributed article.

For more information about Calabrio - visit the Calabrio Website

About Calabrio

The digital foundation of a customer-centric contact centre, the Calabrio ONE workforce performance suite helps enrich and understand human interactions, empowering contact centres as brand guardians. Calabrio ONE unites workforce optimisation (WFO), agent engagement, and business intelligence solutions into a cloud-native, fully integrated suite.

Find out more about Calabrio

Call Centre Helper is not responsible for the content of these guest blog posts. The opinions expressed in this article are those of the author, and do not necessarily reflect those of Call Centre Helper.

Author: Calabrio

Published On: 28th Oct 2021 - Last modified: 2nd Nov 2021
