Ridley Scott’s 2007 film, American Gangster, told the story of Harlem drug lord Frank Lucas. Frank, played by Denzel Washington, confronts Nicky Barnes about the quality of the product he’s distributing in this scene. Nicky isn’t exactly receptive to this feedback. His focus is on short-term profits, while Frank takes a longer-term view of the business and of customer perception of the brand. Quality, be it the quality of a product or the quality of service, is a huge component of brand reputation.
For any product or service, ensuring consistent, high-level quality comes at a cost. This is also true of contact centers. New advancements in technology are changing the financial cost of an effective quality assurance program, while also presenting new value to businesses that care about how customers perceive their brand. ApexCX Sr. Consultant JD Fairweather brings his expertise to this conversation about the evolution of QA in contact centers.
We discuss:
Connect with JD on LinkedIn
Music courtesy of Big Red Horse
Rob Dwyer (00:02.853)
Welcome back everyone. Thanks for joining another episode of Next in Queue. Today I've got JD Fairweather in the house. JD, how are you?
JD Fairweather (00:11.522)
I'm doing well, yourself?
Rob Dwyer (00:13.284)
I'm great. I cannot wait for us to talk about this topic. It is a topic that I wish I talked more about on the show, but people would probably get bored. But you were like, hey, let's talk about quality. And so I'm excited to do that. But before we do,
JD Fairweather (00:15.31)
this video.
Rob Dwyer (00:41.344)
We should probably get to know you a little bit. You are a senior consultant with Apex CX. Tell us a little bit about you and a little bit about Apex CX.
JD Fairweather (00:52.03)
All right, this will probably be the most boring part of the podcast, talking about myself. I'm an introvert, so I tend to shy away from these types of conversations. But JD Fairweather, I've been a consultant now for 17 years. I started really early, so I'm one of the very few consultants out there that, you know, didn't start in the latter years of their career. And because of that, I have a vast experience
Rob Dwyer (00:55.948)
Ha ha ha ha ha!
JD Fairweather (01:22.092)
with a lot of different industries. More importantly, a lot of different technology changes that have happened over the last 17 years, things that have transformed the contact center. So the value that I bring to the clients with Apex CX is that I can assess things from a point of view of where things used to be, and also be able to help them adjust to things that are to come. I'm very technical.
Naturally, I grew up around computers. I spent a lot of my time doing coding and stuff. I actually went to school for coding and ended up in the contact center kind of working my career through the opportunities that were available to me at the time. So because of that, a lot of people think I'm into computers. So when I tell them no, now I'm into customer experience and primarily with contact centers.
Rob Dwyer (02:14.348)
Ha ha ha.
JD Fairweather (02:19.246)
It usually comes as a shock, because I kind of know the back end fairly well.
Rob Dwyer (02:25.63)
Okay, so first machine that you had at home or worked on on a regular basis, what was it?
JD Fairweather (02:31.406)
Ooh, wow. I'm going to say a Commodore 64. But yes, absolutely. The big cartridges, the tape attachment that you connected to it.
Rob Dwyer (02:35.87)
yeah, Commodore guy.
Rob Dwyer (02:45.084)
Mm hmm. Press play on tape. That's when I first started playing Oregon Trail on a Tandy machine. That was how you got going; it would prompt you to press play on tape. Then you kept moving before you died of dysentery.
JD Fairweather (03:06.498)
But one of the rarest machines that I've worked on, or used, or had in our house: we had a Xerox machine, a computer. Xerox made a computer at one point, and so did Kodak. Kodak made a computer that we had at our house as well, that my dad used to tinker with as it aged out.
Rob Dwyer (03:18.364)
wow.
Rob Dwyer (03:24.556)
That's awesome. Okay, so let's talk about, and I'm going to take you down memory lane, what's the first coding project that you worked on? As a young man, what was that? What did you make that did something? What language were you using?
JD Fairweather (03:46.83)
The language that I was using was BASIC, yes. The first thing that I can remember coding, and I was actually having a conversation with one of my older siblings about this, is there was this program that you did on the Commodore where you made a balloon and it went across the screen, and
Rob Dwyer (03:53.216)
BASIC on the Commodore? Yeah.
JD Fairweather (04:16.046)
Even though it's probably the smallest of the coding projects that I've worked on, it was one that really caught my interest, seeing something that you typed in and then watching it run. There was a manual that came with the Commodore 64, and it was a program in that manual to help you learn how to code for the Commodore 64. But it's that one that really sparked my interest and made me see the capabilities of what programming could do.
Rob Dwyer (04:47.338)
Yeah, gotta love those VGA graphics. Those were the days. Those were the days. Well, you know, one of the important things about coding is ensuring that you're doing things right, that you're meeting certain standards, and that holds true of all kinds of other things, including in the contact center. And quality is what we're here to talk about today. And that really is this process of
JD Fairweather (04:50.06)
yeah, absolutely.
Rob Dwyer (05:17.482)
determining whether or not you're adhering to kind of the established experience that we want.

Take me back to what your first experience was, whether it was within a quality organization or a quality process, when that was, and what that looked like.
JD Fairweather (05:39.854)
Well, like most individuals that end up consulting for contact centers or in customer experience, I started off in a call center. And some of the experience that I had, starting off really on the sales side of things, was you had a pitch that you worked on, or that was created or crafted, and you went through it over and over again, perfecting it until, you know, it kind of
met most of the customer needs. It reached and touched most of the customers. It took away a lot of the flexibility that you had in that kind of dialogue you have with a customer, but it taught me the structure that is needed within a conversation for it to reach success. And I think that is probably where that quality piece started: understanding that there's a structure to a conversation that leads to success.
And then there's different ways that you can piece it together so that it's your own. And I think that's a component of quality assurance a lot of people miss.
Rob Dwyer (06:47.946)
Yeah, I absolutely agree. I've talked about this on the show before, but my first experience, though short-lived, in a contact center was actually outbound credit card sales when I was in college. And, you know, there was a literal pitch book that you were given. It was laminated sheets. And so when a customer would raise an objection, you would flip to that sheet and it would
give you some scripting on how to overcome that objection. I absolutely hated that job, as I'm sure a lot of people who tried it out did. It was, you know, calling people in the middle of dinner, sometimes at night because it was an evening gig. And so you'd get some people that were not exactly thrilled about
you calling them trying to get them to sign up for a credit card. But what it showed me was that when I followed a process, the results were somewhat predictable, and I knew that no matter what was said, I had a response, a response that was thoughtful and crafted by someone who had
experienced this for a long time. The challenge was how do I get quickly to that response, which was a challenge with those little flip decks that we had. But you're right, that structure piece, though I didn't necessarily recognize it at the time, was critical, because I didn't have any type of sales background at that point in time.
I had never really sold anything in my life. I had maybe taken some orders working at the Dairy Queen, but I hadn't sold anything. And so it was a real...
JD Fairweather (08:52.974)
Would you consider yourself that you were good at it? Did you struggle?
Rob Dwyer (08:59.208)
I actually was good at it. I was good at it very early on. I remember, you know, just like any contact center, certainly at the time, you show up, you've got a class of new recruits, and you kind of go through and practice. And I remember, kind of out of the gate, people were like, Rob's really good at this. I think, though, that that had come from a lot of previous experience with public speaking.
And so I was probably more comfortable just speaking in front of a group of people than most everyone else. And so that probably skewed people's perception, because a lot of people might be really good on the phone, but standing up in front of a group and doing this kind of mock call can be very intimidating to a lot of people. So I wouldn't say...
I was necessarily better than anyone else. I think the mock call arena that I was put in was less intimidating for me than it probably was for a lot of other people. How did you feel the first time?
JD Fairweather (10:16.002)
Well, heading into sales... because the first job that I had within a contact center was actually in collections. And collections is tough. It throws you into the deep end and it never throws you anything to help you. And so it was definitely a stepping stone that allowed me to acclimate myself to a contact center better than if I had jumped directly into sales,
Rob Dwyer (10:24.06)
brother, that is tough.
JD Fairweather (10:42.924)
because of all the experience that you get from trying to claw money back for organizations. The steps and the process that you go through kind of teach you the customer experience and the sales aspect of it as well. But sales... because, again, I'm very technical, very structured in thought and in practice. And so coming up with the script,
putting it together, and then just fine-tuning that script over and over. I noticed that I could push myself above some of the others that had that kind of natural gift of gab and the ability to position products in a very creative way. I could do it more by laying it out into different methods or processes I could walk someone through. So I did find success. I don't think I found success the way everyone else did.
Rob Dwyer (11:34.56)
Mm-hmm.
JD Fairweather (11:39.014)
Like I said, more people were comfortable with just talking with people and positioning products. Again, that introvert side of me looks at things slightly differently.
Rob Dwyer (11:50.432)
Yeah, I know exactly what you're talking about there. So let's talk about quality as it relates to the C-suite. Because, actually, let's back up. Let's talk about how quality has been executed for probably the bulk of the time that you and I have been involved in contact centers.
What does that quality process often look like, knowing that there are a lot of variables?
JD Fairweather (12:25.816)
Yeah, definitely a lot of variables. Now, through, like you said, a period of time... 17, 20 years of contact center. A little bit more than that. 20 years of contact center, 17 years of consulting. It has gone through some changes, but the basics of it are: there's a conversation or an interaction that is recorded.
That is reviewed by an individual who is evaluating it based off of some metrics that have been determined to be impactful, typically more so for the agent than for the overall conversation, which is a problem. And then at the end of that, you're grading or scoring it based off of certain criteria that is determined usually by executives or some upper-level management,
Rob Dwyer (13:08.725)
Mm.
JD Fairweather (13:23.086)
based off of what the outcome, or the desired outcome, is for the overall experience of customers. That process has gone through a lot. At one point, you weren't able to do screen captures and a lot of the more technical aspects of it that you can do today. And even today, now you've got AI and you're able to evaluate a lot more. But for the most part, we're still talking about looking at a call,
identifying the areas that have been shown to produce a good experience, or a better experience, and making sure that those areas are touch points. Having the guidelines met is huge. I'm not a big fan of adding too many guidelines into the quality assurance piece, because, again, it only focuses on the one side of the conversation, and it needs to have some balance. Guidelines are good, but not the entire
guidelines of the contact center crushed into your QA form. And then you're utilizing that structure to find out where you are as an organization, and the service that you're providing for your customers.
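The traditional workflow JD describes, a recorded call marked against a fixed set of weighted criteria on a QA form, can be sketched as a simple weighted scorecard. Everything here (the criterion names, the weights, the helper `score_call`) is hypothetical for illustration, not drawn from any real QA form:

```python
# Minimal sketch of a traditional QA scorecard: each criterion carries a
# weight, an evaluator marks it met or not, and the call gets a 0-100 score.
CRITERIA = {
    "greeting": 10,          # used the approved opening
    "verification": 20,      # verified the customer's identity
    "issue_resolution": 40,  # resolved (or correctly escalated) the issue
    "closing": 10,           # recapped and offered further help
    "compliance": 20,        # required disclosures were read
}

def score_call(marks: dict[str, bool]) -> float:
    """Return a 0-100 score from per-criterion pass/fail marks."""
    total = sum(CRITERIA.values())
    earned = sum(w for name, w in CRITERIA.items() if marks.get(name, False))
    return 100 * earned / total

marks = {"greeting": True, "verification": True,
         "issue_resolution": True, "closing": False, "compliance": True}
print(score_call(marks))  # 90.0
```

Note how the form only describes the agent's side of the interaction, which is exactly the imbalance JD is pointing at: nothing in a scorecard like this captures what the customer experienced.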
Rob Dwyer (14:34.956)
So you just brought up something that I want to dig into and that is the focus of QA being on the agent. Tell me more about that.
JD Fairweather (14:44.78)
Yeah, you know, it's an easy way... man, I'm trying to think, because it can be the Swiss Army knife of performance across the contact center. But I think it's more the Swiss Army knife of the performance of an organization, because you are literally listening to the experiences that your customers are going through, the challenges that they're having. So, you know,
this channel of the contact center allows for you to evaluate and to experience what they're experiencing, based off of the call that they're calling in for. How it's used in a lot of contact centers, though, is more so to evaluate the agent, to determine the performance of the agent and whether or not they're meeting certain criteria that is determined by management. The problem with that is
the headcount issue. It's very taxing on an organization to hire or to build out a quality assurance group. So what typically happens is you have a very small QA group, and they're listening to very few calls. And then you're using those very few calls to determine the performance of someone. So a lot of times you end up with this metric where you listen to two calls per week and then you grade
the agent based off of those two calls. But those could have been literally their worst calls. It's such a toss-up between the experience one customer has had with that same person at the beginning of the morning versus what another has had at the end of the afternoon. So if you only utilize those few experiences, or those few interactions,
Rob Dwyer (16:27.07)
Mm-hmm.
JD Fairweather (16:39.15)
you can really press down on agents about things that they're not really having trouble with, instead of having a more total view of the service that agent is providing.
Rob Dwyer (16:55.766)
Yeah. One of the challenges that I've seen, having led a quality organization with people who do exactly what you're talking about, right? They just listen to calls all day and they're scoring them. There are some inherent challenges when you're relying on people to do that, because you've also got some metrics that they're trying to achieve, right? You mentioned two calls a week, and that may be the metric that you're holding them to.
But as a person who knows I've got to knock out X number of evaluations every single week, based off of the headcount that maybe is assigned to me, I know I've got to get to X.
Then I can maybe break that down by day, and break it down by hour, and I know I've got to get through X number of evaluations every hour.
I am going to often take into account how long an interaction is, because there's a big difference between scoring a call that is three minutes long and scoring a call that is 13 minutes long. When you start to scale that over the course of a week, if I add 10 minutes to every single phone call, I'm probably not going to be able to get through them all. So as a person, what do I do?
As a quality evaluator, I probably listen to shorter calls. Now there are ways around that, right? Maybe you have calls that are randomly assigned, and so you get some randomness there, but then you still end up with someone who is trying to move through these interactions based off of time constraints. And so that's an inherent challenge.
Rob Dwyer (18:49.6)
Then you get to the sample size problem, which is, maybe I take 20 to 40 calls a day. You scale that out over a week, and then you're grading me on two? That's what I get? That can absolutely create some problems.
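Rob's sample-size problem is easy to quantify. A back-of-the-envelope sketch, using the illustrative figures from the conversation (20 to 40 calls a day, two evaluations a week):

```python
# How much of an agent's work does "two evaluated calls a week" cover?
calls_per_day_low, calls_per_day_high = 20, 40   # Rob's figures
days_per_week = 5
evaluated_per_week = 2

for per_day in (calls_per_day_low, calls_per_day_high):
    weekly = per_day * days_per_week
    coverage = evaluated_per_week / weekly
    print(f"{weekly} calls/week -> {coverage:.0%} evaluated, "
          f"{1 - coverage:.0%} never reviewed")
# 100 calls/week -> 2% evaluated, 98% never reviewed
# 200 calls/week -> 1% evaluated, 99% never reviewed
```

So even at the low end of call volume, 98 of every 100 interactions go unreviewed, which is the gap AI-driven evaluation of every call is meant to close.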
JD Fairweather (19:12.096)
It can. I probably should have prefaced this conversation with that. So, 17 years I've been consulting, but 12 years ago, I noticed that a lot of the organizations that we were working with did not have an internal QA group. And even at times when they did have a QA group, that QA group was spread out across different tasks and roles, so they were unable to evaluate a lot of the conversations or the interactions.
So I decided to put together a small QA business, utilizing my experiences within the contact center, as well as this idea that I had that quality measurement, as far as QA within the contact center goes, is really more of a behavioral type of function. And because of that, I thought that you should absolutely seek someone who has that kind of
psychological, behavioral experience, who's studied that form of understanding of what's going on in the head of the customer, what's going on in the head of an agent, and who is able to evaluate utilizing that experience, as well as utilizing a form that was structured to pinpoint those areas: what's going on with the agent, what's going on with the customer.
And so I set out to do this. We hired a group of individuals, mostly college students that were working through their psychology major, to do these evaluations. We gave them leeway, with the permission of our clients, to look into different things. Based off of reviewing these interactions, they would be able to run their own little projects for their schoolwork.
One of those projects, with one of the clients that we had, was
JD Fairweather (21:13.558)
identifying how an individual responds based off of accents. You know, that was the big thing going on at that time, outsourcing: whether or not customers had a negative experience just based off of someone's accent. How did they treat the person? Were they treating people differently? And we would do these types of experiments, and they would grow. Specifically with this client, we started off doing accents.
It didn't show as much of a gap as we thought, or as people were assuming, would be there. What mattered more than just an accent was whether or not the situation was resolved. If the situation wasn't resolved and there was an accent at play, then the customers were more frustrated, because they felt like someone did not share the experience that they had, based off of the accent. So it was kind of
Rob Dwyer (21:59.275)
Mmm.
JD Fairweather (22:05.87)
of a bias of location, a geographical bias of, you know, if you're not here within the United States, you don't understand what I'm going through or what I'm feeling. And a lot of times that's correct. You know, there's a difference between the cultures in the different regions of our globe, and because of that difference, you might not understand how someone feels about a certain product or a certain service, because you don't really experience it in that way.
But for the most part, it came down to: did you resolve my issue? If you resolved the issue, no issues, you know, accent or not. But if it wasn't resolved, then it was a bigger issue. We moved on from there. Some individuals also did it based off of gender: speaking to a male or a female, how did that conversation go? Were you likely to find these
Rob Dwyer (22:35.66)
Hmm.
JD Fairweather (23:05.326)
back-and-forth arguments that can happen on a call based off of that? Because there was, you know, this one guy who seemed to always get into an argument with customers, and we were trying to find out whether or not that was maybe the factor. We saw some of that. So there were things that, looking at it from the lens of kind of the behavioral sciences, of how the customer feels, we were really looking into sentiment.
These days we talk heavily about sentiment. Then, no one was talking about sentiment. Being able to provide our clients that type of service was of great benefit to them, and to us as well, because it gave us a lot of insights. Every once in a while, there were setbacks based off of the data that came through. Sometimes we'd ask for permission to run these, even on that one with the male and the female.
When that information is then delivered back to a contact center: like, you gave us permission to do this experiment, here's what we found. It's not always welcomed, because these are things that impact how you hire. It impacts whether or not there are issues within your organization that would have to be dealt with, that some organizations just really don't want to deal with.
At least not through that channel, not through the contact center. If it was HR bringing these things up, it'd be a different situation. But from a contact center, because you're dealing so directly with customers... and we're going to see this a lot going forward, especially now with AI doing a lot of these evaluations and what they're able to evaluate. There's going to be a lot of these little nuances that no one thought was happening
that we'll be able to detect, based off of being able to review all calls instead of just a handful of them.
Rob Dwyer (25:05.322)
So you talked about how that information is received. And I wonder if that speaks to what organizations are often focused on when they're trying to run quality. Like, what is the goal? And you talked about this already, right? The goal is often agent performance. That's what I care about.
How is the agent compliant with these things that I set out? But there are all kinds of other goals that you could utilize quality for. The question is: are organizations willing to pursue those goals, or use quality to pursue those goals? Can you talk about some of the other things that quality can help you
identify, achieve, understand, in ways that other functions of contact centers simply aren't equipped to do?
JD Fairweather (26:12.354)
Yeah, I think in a previous conversation I said I wasn't going to get too much into AI. I've been kind of digesting a lot of AI over the past year, and so I can go into a pretty long rant about the functionality of AI and how it plays within the contact center. But I believe that the transformation between what quality assurance was before and what it's about to be is extremely drastic.
Before, again, we talked about only being able to monitor two or three calls per agent. And now you're having this ability to review all of the calls. That probably makes quality, or QA monitoring, the process of doing that, the single most important metric within an organization.
Rob Dwyer (27:08.373)
Hmm
JD Fairweather (27:08.686)
Because it's the only one. You can send out, you know, millions of surveys, and because of survey bias you're only going to receive so many responses back, and those biases also affect how each person responded. Whereas here you're able to listen to the interactions of every person who's had an issue or has a question.
We've all wanted to do feedback surveys, and they usually end up going nowhere. You put out these feedback surveys, you put them on your websites, or you have it kind of built into the process of your customer service. But that feedback is never really captured in any meaningful way, and the data is never properly shared with the departments. But now you're able to capture every piece of feedback.
You're able to capture every complaint, every issue with your product. There's a process when you're doing QA where you drop the bottom number of calls. So if a call is under three minutes, you don't listen to those, because it usually does not meet all of the requirements of the form. But now you're able to listen to all of those. You're able to listen to the 12-, the 15-, the 20-minute conversation and find out
what it was that that customer was going through where it required that amount of attention. You're able to pick up sentiment. You're able to identify how many customers call into your center happy and leave unhappy, or call in unhappy and leave happy. There's just so much that you're able to evaluate that I don't think any other metric even matters after this
is implemented in a proper way. If you think about average handle time, all of the telephony metrics, those are only of value if you know what happened during that interaction. And now you're able to tie that back into the interaction. Why is the average handle time six minutes? Is it good for my business? You're able to identify those things based off of a good quality assurance form, as well as utilizing some of the other metrics to pull out the stats portions of it as well.
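The "called in unhappy, left happy?" question JD raises can be framed as comparing sentiment at the start of a call with sentiment at the end. A production pipeline would run an ML sentiment model over the transcript; the toy keyword scorer below (the word lists, the `tone` and `sentiment_shift` helpers, and the sample call are all made up for illustration) just shows the shape of the analysis:

```python
# Toy sketch: did the customer's tone recover or deteriorate over the call?
NEG = {"frustrated", "angry", "ridiculous", "cancel", "unacceptable"}
POS = {"thanks", "great", "perfect", "appreciate", "resolved"}

def tone(text: str) -> int:
    """Crude polarity: positive minus negative keyword count."""
    words = text.lower().split()
    return sum(w in POS for w in words) - sum(w in NEG for w in words)

def sentiment_shift(transcript: list[str]) -> str:
    """Compare the opening third of a call with the closing third."""
    third = max(1, len(transcript) // 3)
    start = tone(" ".join(transcript[:third]))
    end = tone(" ".join(transcript[-third:]))
    if start < 0 and end > 0:
        return "recovered"       # came in unhappy, left happy
    if start >= 0 and end < 0:
        return "deteriorated"    # came in fine, left unhappy
    return "unchanged"

call = ["this is ridiculous i am so frustrated",
        "okay let me look into that for you",
        "that worked perfect thanks i appreciate it"]
print(sentiment_shift(call))  # recovered
```

Run over 100% of calls rather than a two-call sample, tallies of "recovered" versus "deteriorated" become exactly the kind of organization-level metric JD is describing.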
Rob Dwyer (29:33.91)
Yeah, there is just so much more that you can do and find quickly today that honestly would have been pretty impossible, I mean, even five, six years ago it would have been really hard. There were some companies that were on the forefront of this that were working on it; transcription quality
had a lot to do with some of the challenges. And now, certainly with AI in the middle, it has changed the game. But to your point, there are some really granular things that can make huge differences for organizations. And I'll just talk about an example from someone I was recently talking to. One of the things that we can do is identify negative
reactions that are happening at the end of conversations, right? So a customer leaving angry, upset, frustrated, what have you. And a negative review on Google has a huge monetary value, because really what it boils down to is, on Google, right,
a lot of times I'm going to look at things by how highly they're rated. And so if someone's 4.6 and someone else is 4.4, who am I calling first? I'm calling 4.6. Well, if you can catch people and do that service recovery before they go on Google and throw you a one-star review, that's not just worth
the service recovery of that particular customer; it can impact potential leads coming into your business. And those are quantifiable, and they can be a lot of really high-value leads. Quality, the way it was executed five years ago, doesn't do that. It just can't, because you can't scale it. If I'm picking two calls a week per agent,
JD Fairweather (31:48.984)
No.
Rob Dwyer (31:53.418)
that means I'm missing more than 95% of their interactions. And it could be, to your point, it could have been a really short call. It could have been a call where the customer called, they quickly vented and said, I'm never doing business with you again, and hung up the phone. And by the way, today, if you want to find calls where someone says, I'm never doing business with you again, you can do it like that. It is that fast. And so
JD Fairweather (32:18.914)
Yes.
Rob Dwyer (32:22.004)
it's a way for you to quickly identify what are the products, the processes, the services, the breakdowns that are leading to those types of interactions. So then I can turn around and go fix it in my business, or I can fix it with the person who is causing those breakdowns, whatever the case may be.
That wasn't really a thing before.
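The "find every call where someone says I'm never doing business with you again" search Rob describes amounts to a phrase scan over transcribed calls. A minimal sketch, in which the phrase list, the `flag_churn_risk` helper, and the sample transcripts are all hypothetical:

```python
import re

# Phrases that signal a churn risk worth a service-recovery follow-up.
CHURN_PHRASES = [
    r"never (?:doing|do) business with you",
    r"cancel(?:ling)? my account",
    r"switch(?:ing)? to (?:a )?competitor",
]
PATTERN = re.compile("|".join(CHURN_PHRASES), re.IGNORECASE)

def flag_churn_risk(transcripts: dict[str, str]) -> list[str]:
    """Return the IDs of calls whose transcript contains a churn phrase."""
    return [call_id for call_id, text in transcripts.items()
            if PATTERN.search(text)]

calls = {
    "call-001": "Thanks, that fixed it, have a good one.",
    "call-002": "Forget it, I'm never doing business with you again.",
    "call-003": "If this keeps up I'm switching to a competitor.",
}
print(flag_churn_risk(calls))  # ['call-002', 'call-003']
```

Because this runs over every transcript rather than a sampled handful, the flagged list doubles as the service-recovery queue Rob describes: the customers to call back before they post the one-star review.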
JD Fairweather (32:51.032)
Yeah. And, you know, again, the impact that has, the change that that has: I think it takes it from being one of the bottom metrics that is looked at by executives. If I'm brought in on a project with a client and I'm sitting down with high-level executives, I say, let's listen to your calls. Let's see what's going on within the contact center. And so you queue those up.
You ask them not to cherry-pick them, and you play them back, and you watch the faces and the reactions of these execs as they listen to the experience that their customers are having. Their eyes light up. They're calling people in there. They're having that Wilson Fisk moment out of Daredevil, where he's fussing with everyone: get everyone in the one room and just, you know, cuss them all out and tell them to go out there and fix this.
Because it's an experience that you can see is tangible. It's not numbers. It is the actual experience that an individual is having with your organization. But the problem, like you mentioned before, is that it didn't scale well. It scales horribly. If you really want to be able to do evaluations, you probably need one evaluator for every four agents. And I've been reviewing organizations, even one recently, where they had
Rob Dwyer (34:05.526)
Mm-hmm.
JD Fairweather (34:19.863)
three evaluators for 200-plus agents. And even then, it was just those three. I wasn't called into a meeting; I wanted to speak to someone who was responsible for providing me analytical data. It was a Monday, and that person was like, my plate is slammed today. I cannot help you today. I'm like, what's going on? What's going on
with your day that would pull you away? Because that person was kind of given to me as the point of contact. It was like, well, sales came in over the weekend and they were bad. And because of that, you know, the CEO wants a report on the desk immediately of what happened. And this person is just a data analyst. They're just pulling data off of the servers, trying to piece the numbers together to see what happened. I'm like, okay, and I don't think anything of it. But then in my email, I get an invite to this meeting.
It said I was only there to observe, but I could tell it was almost guaranteed they'd ask me a question. So, you know, I go to this meeting, and they're going back and forth about what happened to the numbers. There's, you know, a lot of, well, it could have been this, it could have been that. And I really didn't want to get myself involved in it, because it's kind of outside the scope of what I was brought in for. But I asked the question: what is QA saying? What does the QA team say happened?
And it went silent, and they were like, well, the QA team is working on something else. And I was like, well, that's the issue. I mean, this data analyst, he's going to be able to pull numbers for the entire weekend, and the weekend previous, and he's going to be able to see the trend and see the result, but he's not going to know what created that result. But if you listen to 20, 30 calls, just 20, 30 calls of what happened over the weekend,
Rob Dwyer (36:01.078)
Mm-hmm.
JD Fairweather (36:16.736)
likely you're going to be able to pinpoint that specific issue that was affecting either the team or your customers at that time, that dropped your numbers so drastically.
Rob Dwyer (36:27.914)
Yeah, yeah, it's all about the why. When you talk about scale, I was actually just talking to a group of contact center leaders last week, and I was actually shocked. They've got roughly 300 agents, and they said they QA 10 to 15% of their calls, and I was like, wow, you guys are rocking it out. Their problem, though, is they're still growing.
They anticipate being roughly 500 agents by the end of this year, and they expect to be a thousand agents by the end of next year, and they can't scale it anymore. They're like, we want to be even higher than 15%. We want to score 100% of calls. But they recognize that that is not financially feasible, because you're putting so many people into a position of
just listening to calls as opposed to putting people in a position where they're, in this case, driving revenue and doing the work of the contact center, which is interacting with customers. And so I think you're right. The ability of technology today to fill that gap, which has long been a gap, is
incredible, and it provides you with a better picture, because I'm not cherry-picking. I'm not relying on such a small sample size. I am able to see trends, and I'm able to see what happened on a particular interaction.
Maybe it's JD, right? And I can see for JD, yeah, he didn't do this very well on this particular interaction, but it's an outlier. Because I can see the data on 100% of JD's calls that says, no, like 95% of the time, he does a really good job on this. This happens to be an outlier. And so from an agent performance standpoint, maybe I can identify the reason this is an outlier and address that, as
Rob Dwyer (38:42.572)
opposed to addressing some generic behavior. But then I can see all these other things that are so high-value to the organization. I guess my question to you is: today, given all of that, the problems that we know exist with QA, the advancements that have been made with technology, what's the
pushback from C-suite leaders? Is it that they don't know? Is it that they don't want to invest? Like, where do you see that today in terms of implementing a solution?
JD Fairweather (39:21.068)
Yeah, it's one of the reasons why I got out of the QA business, right? I haven't done QA in probably about five years now, because it was a hard sell getting them to understand. Like you said, 10 to 15 percent. I wanted to go up to maybe even as high as 20%. I feel that's a good sampling of what's going on in your contact center, if you want a true metric that you can say is closer to the absolute versus, you know, an estimate.
But 10 to 15 is really good. I mean, I've seen it as low as 5%. And it's really hard for them to understand putting that amount of headcount toward monitoring, evaluating, and listening. And what would typically happen is, even if they were doing it in-house, they would put that pressure or that responsibility on the supervisors. And the supervisors weren't able to do anything else.
And even if you brought in an organization like mine at the time to relieve them, then you had all this data coming in and no way to apply it, because supervisors were too busy doing other things and putting out fires to really utilize that information in a meaningful way. I would love to see the numbers on the amount of coaching that actually happens within a contact center
Rob Dwyer (40:45.078)
You
JD Fairweather (40:45.41)
versus what they believe is happening within a contact center. Because the supervisors, it reminds me a little bit of the challenges sometimes within our educational system, where you have one teacher and 30 students. And a lot of times that's exactly what happens. You have one supervisor, and you have 20-something-plus individuals that you are supposed to be motivating, and you're supposed to be coaching, and you're supposed to be developing.
And because of how busy it is to then be doing those things, going over the metrics, trying to improve performance, it's just one of those things where no one can find value in these QA form scores, where you get these scores and now you have to sit down with someone one-on-one and say, here are the areas of opportunity that you have, here are the challenges that you might have had.
So it's a lot of time. There's not an easy groove it can just slot into in the current contact center, if there's not someone inside championing it, saying, you know, these quality scores are important. We need to keep a team that is dedicated to it and review this information for the insightful data that it is.
Rob Dwyer (42:06.314)
Yeah, it is, I think, too easy to get caught up in putting out fires in the contact center and not thinking strategically, not acting strategically, to do the things that we know will help performance in the long run. And we also do ask a lot of supervisors.
Like that frontline, whether you call them team leaders or supervisors, whatever, we ask a lot. Sometimes the spans of control, to your point, are so big that it's hard for them just to get through a day putting out the fires, much less thinking about, how do I help my team perform better in the long run? Like, they're just busy, maybe taking escalations and
dealing with PTO requests and who knows what else is on their plate, because it's a lot.
JD Fairweather (43:13.25)
That, I think, is one of the main issues they're facing right now. Not to talk too much about AI, but one of the issues they're facing with a lot of these implementations of AI within the contact center is that they're built around efficiency. They believe they're going to go in and make the contact center more efficient. But the contact center is already very efficient. They already have workforce management tools, so they know what every agent is doing every minute, every second of the day, including breaks and lunches.
I mean, can you imagine implementing workforce management in a large organization with a marketing department that's in the hundreds? Everyone would balk at being monitored and evaluated at that level of minute detail. And then for supervisors, like you just mentioned, you have all these other tasks and responsibilities, so that even if you were to implement something highly efficient for a supervisor utilizing AI,
they still have so much work filling in those gaps, things they weren't able to accomplish before, that the gains from that implementation of AI are going to get swallowed up by all of the other work they're responsible for. And QA is one of the areas that has suffered the most in organizations because of this. So if you can take that portion and get it to where it's almost a science: doing the evaluations, scoring the evaluations,
providing each person it touches with what they need. So you provide the agent with a summary of the evaluation. You provide the supervisors. You provide a manager with a report. You provide the directors with reports. You provide the execs with an overall understanding of the sentiment and the challenges. And everyone gets what they need without laying that responsibility on one person, a supervisor or a manager, individually.
I think that's the way forward. And then you'll be able, again, like we spoke on earlier, to see the value of that data: every conversation, everything that's even going on in the background for your callers. Like, some callers are rushing you off the phone, but if you heard in the background that they have an intense household, then you would understand how to treat those customers better. I think, you know, through that process, you're not only going to
JD Fairweather (45:36.546)
handle your QA better, you can also help agents handle situations better, if you're doing this kind of augmented, assistive AI within your centers.
Rob Dwyer (45:46.132)
Yeah. Man, I feel like you've provided a great commercial for Happitu, so I'm just going to give the shameless plug here: if you're looking for automated quality assurance, talk to me. I'd love to see if we'd be a good fit for your organization. But if you're looking for help with your contact center, I mean, JD, you've worked with some of the largest brands in the world. And certainly it doesn't matter how big
your contact center is, you can handle it. You can come in and help them understand things they might be able to improve. And look, if you're considering an implementation, we're talking about quality here, and you're a leader in an organization, and you're thinking to yourself, I know I need to do this, but I don't know where to start. You can help them with that, right?
JD Fairweather (46:42.028)
I can. And what I would love to pitch right now to anyone listening is for these startups. These startups who are out there creating these AI services: reach out to someone, talk to someone who's been working within a contact center for years. I've seen so many of them have setbacks because they don't understand the obstacles and the challenges within the contact center. They don't understand how it's intertwined within an organization.
They don't understand where knowledge is created and where it's transferred from. So they are putting out these products and services from only one point of view, and that's the customer point of view that they had, instead of understanding that these are things that have been decades in the making. There are processes that are still there from the early 90s. There's some software that's still plugged in.
Rob Dwyer (47:38.188)
So the early 90s were not that long ago, JD, right?
JD Fairweather (47:41.582)
Yeah, no, but you should not have that software from the early 90s still plugged in, and they do. Right? You know, in the early 90s, you should have been transitioning from the handset on your desktop to something a little bit more network-based. Now, we didn't move into the cloud at that point, but there was definitely more networking happening on the telephony side of things.
Rob Dwyer (47:51.532)
Yeah.
JD Fairweather (48:11.35)
And they don't understand all of those nuances. They don't understand it. You know, one of my big arguments in discussions these days is that the contact center was created because there was a problem with being able to scale customer service. And if you want to really scale customer service using AI, you need to go back to that root. You need to go back to the experience that was had when there were only a few founders, and those few founders answered every call. They handled every sale.
And they got to know the individual customers they were working with very well, and they provided, or aimed to provide, the best service possible. And then from that place, you start to scale. You don't scale from what the contact center is today, because that was scaled off of an issue created at the very start of an organization.
Rob Dwyer (49:05.248)
Love that. Connect with JD. You'll find his LinkedIn down in the show notes, along with Apex CX. JD, thank you so much for joining Next in Queue. I loved this conversation.
JD Fairweather (49:16.291)
My pleasure. I enjoyed this call.