Customer experience is a major area of focus right now—and for good reason. Companies that improve their digital customer experience see:
- 92% stronger customer loyalty
- 84% higher revenue
- 79% lower costs
Your customers are the only ones who can tell you how you’re doing in customer experience. Knowing as much as you can about your customers’ perceptions will not only lead to a better support experience but also drive retention, loyalty, and revenue. Today, the focus is on using the customer satisfaction score (CSAT) to determine how customers view your brand.
The customer satisfaction (CSAT) score is one of the most widely used customer experience metrics. With CSAT, customers are asked individually to quantify how they feel about an interaction with your company.
When used well, CSAT scores provide you with two important types of information:
- Detailed impressions of individual touchpoints
- Aggregated scores of customer experience—overall, by department, etc.
You can aggregate CSAT scores in whatever way makes sense. You should be aware of your overall CSAT score, but you may also want to compare scores based on certain factors. It’s often useful to compare CSAT scores at different phases of the buying cycle or by the service department if you think there are missed opportunities.
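As a rough sketch, segment-level aggregation can be as simple as grouping responses by an attribute such as department. The data below and the convention that a 4 or 5 counts as "satisfied" are assumptions for illustration only:

```python
from collections import defaultdict

# Hypothetical survey responses: (department, rating on a 1-5 scale)
responses = [
    ("support", 5), ("support", 4), ("support", 2),
    ("billing", 3), ("billing", 5), ("billing", 1),
]

by_department = defaultdict(list)
for department, score in responses:
    by_department[department].append(score)

# Share of satisfied responses (4s and 5s) per department, as a percentage
csat_by_department = {
    dept: round(sum(1 for s in scores if s >= 4) / len(scores) * 100)
    for dept, scores in by_department.items()
}
print(csat_by_department)  # {'support': 67, 'billing': 33}
```

The same grouping works for any factor you track, such as buying-cycle phase or channel.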
Why CSAT scores matter
Are your customer satisfaction scores really that important? Data suggests that 89% of consumers make secondary purchases after positive customer experiences—and the metric that shows you how your customers feel is the CSAT score.
It’s all about knowing your customers and adapting to their feedback to exceed expectations. This is a reliable way to build customer loyalty and drive long-term revenue.
CSAT scores identify your customer.
CSAT scores expand your knowledge of your customer base. When engaged through CSAT surveys, the customer’s answers allow you to deep-dive into their headspace. Consider what these CSAT survey questions indicate about a customer:
| Survey question | Possible answers |
| --- | --- |
| Do you like your product or service? | It’s great. / It’s pretty good. / It’s not for me. / I didn’t like it. |
| Did you have a good experience with the customer service agent? | They were very helpful. / They were somewhat helpful. / They were okay. / They didn’t help me at all. / They were very unhelpful. |
| How often do you use our product or service? | All the time. / Just this once. |
From these three simple questions, we can extrapolate that the customer likes your product, didn’t like customer service, and doesn’t use your product very often. A strategy based on these answers might point you towards increasing CX resources (customer service was unhelpful), upselling the customer with a new product or service that’s better suited for them (how often do you use our product), or asking the satisfied customer if they would like to be notified of other similar products or services (your product is great).
Pain points in the customer journey
Imagine your customer’s journey through your brand or company as a maze. A maze is filled with dead ends, and when your customer tries to navigate through the maze and hits one, that’s a customer pain point. The dead end could be a lack of responsiveness to their problem, a difficult or unwieldy website, or a frustratingly long wait navigating the IVR.
All of these pain points need to be addressed and corrected if you want to increase your customer’s satisfaction, and subsequently, your CSAT scores.
Zoom out: Mindful alleviates some of those pain points, like offering a widget to schedule a call at the peak moment of frustration in a dead end. And giving customers the choice to receive a callback instead of waiting on hold can turn a pain point into a bright spot.
Learn more by measuring one experience.
The CSAT score is most useful when you ask customers about a specific experience. Each customer gets a questionnaire specifically targeted to a recent interaction, and they respond with their impressions of that interaction.
You can then aggregate answers related to similar experiences—a request for support, a product return, and so on—and look for commonalities. Survey customers at as many touchpoints as you need, but make sure each CSAT questionnaire focuses on that particular experience, not the customer’s overall impression.
Use CSAT scores to find problems.
The beauty of the CSAT score is that you can look at responses from many different situations and then compare results to find out what might be going wrong in an area.
For example, you may want to know where to focus your customer service rep training. Come up with a few hypotheses for where the problem might lie—wait times? rep knowledge?—and then test those theories by presenting customers with questionnaires.
Look for trends in the results, but also pay attention to outliers. You can boost your reputation by offering special thanks to very satisfied customers and smoothing things over when someone has a bad experience.
How to write a good CSAT questionnaire
The best CSAT questionnaires get the information you need in as few questions as possible. In many cases, you can get away with just one question: How satisfied were you with your customer service experience?
Give the customer a clear and intuitive framework for choosing their answer. Examples include:
- A numerical scale ranging from 0 (“very unsatisfied”) to 5 (“completely satisfied”)
- A verbal scale ranging from “very unsatisfied” to “completely satisfied”
- A series of face emojis: angry, frowning, smiling, grinning, etc.
- An even simpler thumb scale: emojis of thumbs down, neutral, or up
Which scale you choose depends on your brand guidelines. More formal brands might prefer more formal numerical or verbal scales, while more casual brands might go with emojis or pictures.
The wording in your survey questions is also extremely important. You should always ask yourself the following before compiling the survey:
| Question | Tip | Poor CSAT question | Good CSAT question |
| --- | --- | --- | --- |
| What answers am I looking for? | Only ask questions that are relevant to the information you need. Keep questions short and direct, ask yourself first: What am I looking for? And try not to influence the customer’s choice with your wording! | Did you like what we did to improve and enhance our product or service? | Do you like our product or service? |
| Is call center CX an issue? | Ask directly: Did you have a problem with the customer service agent? Did they handle your issue to your satisfaction? Follow up with an open-ended question: Can you tell us a little more about your interaction? | Did you not like how our customer service agents handled your problem? | Tell us in a few short words how we can improve customer service for you. |
| Did the customer have a poor shopping experience? | This is better served by multiple-choice answers. Because so many things could have gone wrong, you may be asking about six different aspects of the experience and never get the right answer. If you don’t want an open-ended question, give customers a choice of answers (and make sure to allow them to choose all that apply). | Was the website too slow? Did you find it hard to seek support from our team? Was our self-help section too hard to navigate? Was your shopping experience less than desired? | What part of your experience did you have problems with? a) Website too slow. |
| Do I want to avoid an NPS survey? | NPS questions that ask if you would refer a product or business to a friend can easily be folded into a CSAT question. | How likely are you to refer our brand to a friend or colleague? (NPS-specific) | After your last interaction with us, what are the chances you’d refer us to a friend or colleague? e) Very bad |
A few good rules of thumb when creating a CSAT questionnaire:
- Keep survey questions short, error-free, and grammatically correct. If the respondent thinks the question is foolish or annoying, they most likely won’t go any further with the survey.
- Focus only on subject matter you want information on. Don’t waste your customer’s time with useless questions.
- Ask one question, not compound questions. (Example to stay away from: “Do you find our agents helpful, and would you buy from us again because of them?”).
- Don’t ask leading questions (We spent a million dollars on customer service this year—did you like it?). The customer may lie because they now have other information factoring in.
If you’re presenting the survey in a written format like email or via a website pop-up, you can add a free-response comment form with a prompt like: Tell us more about why you chose that rating.
These free-response fields are useful when you need to look at individual responses and analyze them. For example, if you use a 0 to 10 scale and most of your ratings are 6 or above, you might be curious to know the reasons behind the odd “1” or “2” ratings.
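As a sketch, surfacing those outlier ratings alongside their free-response comments for individual review might look like this (the data shape and comments are hypothetical):

```python
# Hypothetical (rating, free-text comment) pairs from a 0-10 survey
responses = [
    (9, "Fast and friendly."),
    (2, "Waited 40 minutes on hold."),
    (8, "Good overall."),
    (1, "Agent never called back."),
]

# Pull out the rare low ratings so their comments can be read one by one
outliers = [(rating, comment) for rating, comment in responses if rating <= 2]
for rating, comment in outliers:
    print(rating, comment)
```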
You can also add more focused, multiple-choice questions addressing elements like:
- Amount of time on hold
- Friendliness of the representative
- Time to resolution
Be specific about what you want to know and ask as few questions as possible. When surveys get too long, more people drop out.
Calculating your CSAT score
CSAT measures whether the customer likes your product or service on a scale of your choosing, but it usually ranges in scores of 1-3, 1-5, or 1-7. Some companies use emojis or icons (thumbs up, thumbs down) to rate and then assign a numerical value to the icons.
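For example, a thumb scale might be translated to numbers before scoring. The mapping below is just one plausible convention, not a standard:

```python
# Hypothetical mapping from a thumb-icon scale to a three-point numeric scale
ICON_VALUES = {"thumbs_down": 1, "neutral": 2, "thumbs_up": 3}

raw_answers = ["thumbs_up", "neutral", "thumbs_up", "thumbs_down"]
numeric = [ICON_VALUES[answer] for answer in raw_answers]
print(numeric)  # [3, 2, 3, 1]
```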
CSAT is calculated by taking the total number of positive responses—your two highest ratings, e.g., 4s and 5s on a five-point scale (on a three-point scale, count only 3s)—dividing by the total number of respondents, and multiplying by 100. So, if 1,000 customers responded and your top two ratings accounted for 570 positive respondents, your CSAT formula would look like this:
(570/1000) x 100 = 57
In this example, the formula yields a 57% CSAT score. That means 57% of your customers were satisfied or highly satisfied with your product or service. Used as a base score of a customer’s happiness, it’s the first step in creating customer delight for your customer.
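The calculation above can be sketched as a small helper function, assuming a five-point scale where 4s and 5s count as positive; the ratings list is fabricated to match the example’s 570 positives out of 1,000:

```python
def csat_score(ratings, positive_threshold=4):
    """CSAT = (positive responses / total responses) x 100."""
    positive = sum(1 for rating in ratings if rating >= positive_threshold)
    return 100 * positive / len(ratings)

# 570 of 1,000 respondents chose one of the top two ratings
ratings = [5] * 300 + [4] * 270 + [3] * 250 + [2] * 100 + [1] * 80
print(csat_score(ratings))  # 57.0
```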
Incorporate other customer experience metrics.
But one caveat: A CSAT score is only one customer satisfaction metric, and, taken individually, it doesn’t give you enough data to strategize ways to delight your customer. Let’s say 57% of your customers like your product. Why did they like it? Would they tell their friends?
There are also some inconsistencies you should be aware of that can artificially impact the score. These include timing, how responses are classified (what counts as a good versus a fair response), and sampling bias.
For example, if you used only the top-scoring responses in your calculations instead of the top two, you would end up with a lower rating. Many companies use the top two scores because together they represent overall positivity—you might treat Excellent and Very Good as one score. So, from our example above, you have 1,000 respondents and 570 positive responses, but only 400 of them gave the highest rating. In that case, your CSAT score would be 40 instead of 57.
Next, there’s timing. Responses to a CSAT survey are usually based on a specific amount of time. You’re asking about how a customer feels about you during a specific time period (onboarding a new customer, for instance). This CSAT rating won’t show you how the customer feels about you six months down the line, which is crucial if your relationship with the customer got better…or worse. It will only give you a glimpse of the customer interaction at the moment. This is why follow-up surveys (and other types of surveys, like NPS and CES) are absolutely crucial to map out a customer’s complete journey.
Sampling bias could skew your results as well. If you only send CSAT surveys to customers in a specific group (let’s say you’re selling women’s apparel, but you only send the survey to women who have bought shoes from your store), then the results will only show satisfaction/dissatisfaction with that particular group and not overall customer satisfaction with your whole base. Suddenly, you have hundreds of positive responses—and swimwear sales are plummeting. Sampling bias could give you less than helpful results from your CSAT surveys.
How to use the calculation to develop strategy
So now you have a CSAT score of 57 (from the previous example). Is that a good score or a bad score? What do you do with it? The answer will depend greatly on what you were initially trying to achieve with the survey.
If the survey was to gauge overall satisfaction with your product or brand, then a 57 means that roughly 57% of your customers (taking into consideration factors like respondent size, sampling audience, and question specificity) were satisfied. While most companies would love to see 100% customer satisfaction, that is rarely the case. Benchmarks vary by industry, but companies with a CSAT score in the 80-85% range are generally considered successful at customer satisfaction (although the pandemic has certainly skewed these numbers somewhat in the last 18 months). Having 80-85% of your customers satisfied is a good goal to set for your teams.
But you can also use the CSAT score to develop strategy by pinpointing specific areas of your business that may need improvement. Questions about your product or service are helpful (and inspirational if responses are positive), but you may want to query customers about how customer service is doing, how the website is performing, or even ask for their suggestions for improvement. Everybody has an opinion, and some customers would love to share it with you. Making actionable items out of customer responses could help your business grow with your customer’s needs as a priority, instead of throwing pasta at a wall and seeing what sticks.
How to communicate CSAT scores to the team
There’s a team behind your customer experience department. This team is as important as any other team in your company, including sales. They’re the first responders in your competitive landscape, and they should always have the map of your customer’s journey in their hands. But while most companies tout their commitment to investing in robust customer experience strategies, few actually do. Customer experience should be the focus for most companies, because, as Gartner reports, 81% of companies believe that they will compete primarily based on customer experience.
Get everyone on the same page and aimed at the same target.
Make sure that everyone is aware of what your CSAT scores are, how they’re calculated, and what they might mean to the company. This includes teams from development, support, marketing, and sales. It’s important to have cross-team accountability for CSAT since many CSAT questions may be directed at one of these specific departments. If Sales and Support are aware of an issue that involves them both, they may have a solution to correct it. Having a company-wide discussion about CSAT is always a good idea.
Focus on what’s going right.
In the same vein, these meetings should not be an opportunity to denigrate or demotivate any one department. When customer service falters, that falls on the shoulders of all teams. Marketing efforts may be weak, Sales may be ineffective, but Customer Service and Support will always bear the brunt of it—right after the customer. You should always focus on what’s going right first when presenting CSAT to the team. Start with a motivational edge that inspires them—and do not let finger-pointing and a shame/blame cycle take over your teams.
Treat challenges as opportunities.
Be prepared to deal with the negatives—in a positive light. Give all teams ownership in the shortcomings of the CSAT scores. Ask the teams if there are any suggestions for another team to improve their part of the problem. For instance, if you have a high CSAT for customer service interactions but a low CSAT regarding web usability, ask the CX teams what the customers are saying the problem might be and work together collaboratively to come up with some suggestions on how to improve. The development teams may have far less interaction with customers than the CX teams do, so feedback from the CX teams may be invaluable to the development teams.
The CSAT score is just one step toward customer happiness. Overall satisfaction requires a positive customer perception across all touchpoints—not just of how you resolve an issue.
Mindful can help you create that overwhelmingly positive experience with callback, which gets customers off hold by letting them request a call when a rep becomes available.
Mindful also connects the customer with automated notifications, which confirm call times and remind customers when a call is coming up. Customers don’t waste time on hold, and you still get to provide the stellar service you want to offer.
Ready to give your customer loyalty a boost? Try out Mindful today and see how easy it is to get started.