Disloyal customers are costing businesses billions. But what actually triggers disloyalty? Research from CEB Global (now part of Gartner) found that the level of effort consumers put into interacting with a brand directly impacts their loyalty. In fact, according to CEB Global, around 96% of consumers who reported having difficulty solving a problem became more disloyal.
How do you know how easy it is for your clients to interact with your business, though? Well, that’s where Customer Effort Score comes into play.
What Is Customer Effort Score?
A general Customer Effort Score definition describes it as a type of customer survey that measures how easy it was for a client to interact with your business (solving an issue with customer support, making a purchase, signing up for a trial, etc.).
Consumers are generally asked how strongly they agree with a statement (“The company made it easy for me to solve my problem”, for instance), to rate their level of effort, or simply to answer a question (“How easy was it for you to solve your problem today?”, for example).
CES Survey Types
There are a few metrics you can use to measure your Customer Effort Score, but keep in mind that they can change the way you calculate and score surveys:
- The Likert scale – This method involves a “Strongly Disagree/Strongly Agree” scale structured as such: Strongly Disagree – Disagree – Somewhat Disagree – Undecided – Somewhat Agree – Agree – Strongly Agree. The answers are usually numbered 1 to 7, and you can also color code each one to make everything more visually intuitive for respondents (having “Strongly Agree” in green and “Strongly Disagree” in red, for instance).
- The 1-10 scale – This metric involves having respondents offer an answer to your question in the 1-10 range. Generally, the 7-10 segment is associated with positive responses (if you’re asking customers how easy it was to do something, for instance). However, if your question asks the respondent to rate the level of effort, the 1-3 segment will be associated with positive results instead (since they represent low effort).
- The 1-5 scale – In this case, the answer options are as follows: Very Difficult – Difficult – Neither – Easy – Very Easy, and they are numbered from 1 to 5. You can also reverse the order.
- Emotion Faces – While this metric is pretty simple, it’s useful if you run a lot of CES surveys for minor aspects of your product/service/website. Plus, it makes it easy and intuitive for respondents to answer quickly. Basically, you use Happy Face, Neutral Face and Unhappy Face images as responses, with the Happy Face usually meaning there was little effort required.
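Under the hood, each survey type just needs a rule for which answers count as positive (i.e. low effort). Here is a minimal Python sketch of such rules; the names and thresholds are ours, for illustration only, and assume the question asks how easy the interaction was:

```python
# Illustrative mapping of CES survey types to positive-response rules.
# Thresholds assume the question asks how EASY the interaction was;
# for an effort-rating question on the 1-10 scale, the rule would invert,
# with 1-3 counting as positive instead.
POSITIVE_RULES = {
    "likert_1_7": lambda r: r >= 5,        # Somewhat Agree, Agree, Strongly Agree
    "scale_1_10": lambda r: r >= 7,        # the 7-10 segment
    "scale_1_5":  lambda r: r >= 4,        # Easy, Very Easy
    "faces":      lambda r: r == "happy",  # Happy Face = little effort required
}

def is_positive(survey_type, response):
    """Return True if the response counts as positive for the given survey type."""
    return POSITIVE_RULES[survey_type](response)

print(is_positive("scale_1_10", 8))     # True
print(is_positive("faces", "neutral"))  # False
```

Keeping the classification rule separate from the survey itself means you can switch scales later without rewriting your scoring logic.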
When Is the Right Time to Send a CES Survey?
Generally, CES web surveys are sent to customers during these key moments:
After an Interaction That Led to a Purchase
Sending out Customer Effort Score surveys after a client interacts with your product/service or service team and ends up purchasing is a great way to collect real-time feedback about what improvements you need to make to streamline the buying experience.
For example, you should always send out a CES survey after a customer signs up for a free trial or finishes the onboarding period. That’s especially relevant since poor onboarding accounts for 23% of average customer churn, and the feedback helps you quickly determine whether any adjustments are needed to make other customers more likely to buy from you.
Right After a Client’s Interaction with Customer Service
Sending out a CES survey after a customer service touchpoint (such as email support tickets) lets you quickly assess the efficiency of your support team and identify areas for improvement to boost overall performance. You should also consider sharing such a survey after a customer finishes reading a Knowledge base article, since it will help you find out how helpful your content is.
In this case, sending out CES questionnaires at a specific interval is unnecessary. Since the question asks respondents how much effort they had to put into solving a problem, it makes more sense to deploy the survey after customer service touchpoints.
After Any Interaction Surfacing Usability Experience
The CES survey can be sent out after any interaction that, in one way or another, could cause friction and result in a negative customer experience. For example, you can send it after the launch of a new feature to follow up on adoption and inquire about potential pain points, or to learn more about the efficiency of your internal processes and the overall usability of your product.
The question needs to revolve around that interaction and be triggered upon its completion, so that recollections are accurate and the feedback you receive is actionable. With Retently, you can create survey questions tailored to specific events with just a click, so you are never short of ideas.
How to Put Together a Good Customer Effort Score Question
For starters, make sure the wording is as unambiguous as possible. Don’t ask customers about anything that doesn’t have to do with customer effort. Also, the tone of your question should be neutral so that the respondent doesn’t feel like you’re trying to favor a particular answer.
Ideally, you should also avoid using the word “effort” – ironic, we know. That’s because the word’s meanings can differ from language to language, so there’s a chance you might get irrelevant answers.
And make sure your CES survey question clearly delimits the area being analyzed – be it the overall experience a customer had with your website/brand, or a single customer interaction moment (like live chat).
Lastly, there are two ways to format the question:
- Make it a statement – This format is handy when using the 1-7 Likert scale. Here’s a Customer Effort Score question example of that: “How much do you agree with the following statement: The company’s website makes buying items easy for me.”
- Make it a direct question – This format is more suited to surveys that use the 1-10 and Happy/Unhappy face metrics. Here’s an example: “How much effort did it take to solve your problem?”/”How difficult was it for you to solve your problem?”
We personally recommend using the statement format – both because the 1-7 Likert scale is more accurate to work with when calculating your CES score, and because the direct question format usually relies on using the word “effort,” which we already mentioned can be a bit problematic if you have an international client base. If that’s not a concern, though, the direct question format can work well too.
How to Interpret Customer Effort Score Results
One of the easiest ways to measure CES results is to get an average score (X out of 10). This is generally done with the 1-10 Customer Effort Score scale. Simply take the total sum of your CES scores and divide it by the number of responses you have received.
So, if 100 people responded to your Customer Effort Score survey, and the total sum of their scores amounts to 700, that means your CES score is 7 (out of 10).
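The calculation above can be sketched in a few lines of Python (the function name is ours, just for illustration):

```python
def average_ces(scores):
    """Average CES: the total sum of scores divided by the number of responses."""
    return sum(scores) / len(scores)

# 100 responses whose scores add up to 700 -> an average CES of 7 out of 10
scores = [10] * 30 + [7] * 40 + [4] * 30  # sums to 700
print(average_ces(scores))  # 7.0
```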
If you’re using other metrics (like Happy/Unhappy faces or an Agree/Disagree scale), you could also try performing a Customer Effort Score calculation by subtracting the percentage of people who responded positively from the percentage of respondents who offered a negative response. The neutral responses are normally ignored.
For instance, let’s say you had 400 respondents; 250 of them responded positively and the rest negatively. By subtracting 37.5% negative answers (150/400 x 100) from the 62.5% positive answers, you get a CES score of 25%.
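That subtraction can be expressed as a small helper function (again, a hypothetical sketch; neutral responses stay in the total but are otherwise ignored):

```python
def ces_percent(positive, negative, total):
    """CES as the percentage of positive responses minus the percentage
    of negative ones. Neutral responses count toward the total only."""
    return (positive / total - negative / total) * 100

# 400 respondents: 250 positive, 150 negative, no neutrals
print(ces_percent(250, 150, 400))  # 25.0
```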
If you’re using the 1-7 Disagree/Agree scale, we also found it’s best to divide the total number of people who offered a 5-7 response (Somewhat Agree – Agree – Strongly Agree) by the total number of respondents. Afterward, multiply the result by 10 or 100 (depending on whether you use a 1-10 or 1-100 scale). You could do the same with the 1-5 scale (with 4 and 5 being the positive responses).
Here’s an example – if you had 100 respondents and 70 of them offered a positive response, your CES score would be either 7 (70/100 x 10) or 70 (70/100 x 100).
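Here is the same calculation as an illustrative Python sketch, assuming responses of 5-7 count as positive on the 1-7 scale:

```python
def likert_ces(responses, positive_floor=5, out_of=10):
    """Share of positive Likert responses (>= positive_floor),
    rescaled to a 10-point or 100-point style score."""
    positive = sum(1 for r in responses if r >= positive_floor)
    return positive / len(responses) * out_of

# 100 respondents, 70 of whom answered 5, 6 or 7
responses = [7] * 40 + [6] * 20 + [5] * 10 + [3] * 30
print(likert_ces(responses))              # 7.0
print(likert_ces(responses, out_of=100))  # 70.0
```

For the 1-5 scale, you would simply pass `positive_floor=4` so that only “Easy” and “Very Easy” count as positive.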
What Is a Good Customer Effort Score?
The answer is a bit tricky – mostly because performing a Customer Effort Score benchmark against competitors is difficult since there is no clear industry-wide standard to compare against, and also because whether or not your CES score is a good one depends on your Customer Effort Score question and the metrics you use.
After all, if you use the Disagree/Agree scale for answers, where “Strongly Disagree” is numbered 1 and “Strongly Agree” is numbered 7, and you have a statement like “The company made it easy for me to solve my problem,” you’ll clearly want a high CES score – ideally, one above 5 out of 10 (or 50 out of 100, depending on how you rescale it).
On the other hand, if your CES survey response scale associates 1 (or a Happy Face) with “Less Effort” and 10 (or an Unhappy Face) with “A Lot of Effort”, and you directly ask customers how much effort they had to put into performing a certain action, you should strive for a low CES score.
As with other customer satisfaction metrics, to get a grasp of where you stand, you should track your CES over a specific period of time to see if your efforts are paying off. If the score is improving, you are on the right track; otherwise, dig deep into customer feedback to see what you are missing.
Customer Effort Score – The Good and the Bad
CES has both good and bad sides, but let’s see whether the cons pale in comparison to the pros of using such a survey.
Advantages
One of the great things about CES surveys is that they are actionable and specific – they can quickly show which areas need improvement to streamline the customer experience.
Besides that, Customer Effort Score results have been found to be a strong predictor of future purchase behavior. In fact, according to HBR research, approximately 94% of customers who reported “low effort” interactions with a business said they would buy from it again. Also, 88% of those consumers said they would spend more money, too.
The same research also shows that CES can give you an idea of how likely your customers are to refer your brand to others, and how they would speak of it. Basically, around 81% of customers who reported putting in a lot of effort when interacting with a business said they intended to speak negatively of the brand in question. So, it’s reasonable to assume that consumers who are happy with the low level of effort required of them will likely recommend the brand to others or, at the very least, speak positively of it.
Disadvantages
While CES drawbacks aren’t really a deal-breaker, they are worth highlighting. For one, the Customer Effort Score can’t really tell you what kind of relationship a consumer has with your brand in general. A low effort score can improve customer satisfaction levels, but it does not necessarily point to loyalty toward a brand. Also, CES can’t tell you how your customers and their ratings are influenced by factors such as your competitors, products, and pricing.
Another issue worth mentioning is that CES surveys don’t offer a way to segment customers by type. While Customer Effort Score surveys are good at predicting purchases, this power is limited to a specific group of customers who, for example, interact with the support team or go through your self-service options. Since CES surveys are transactional in nature, they focus only on specific interactions and, therefore, a limited group of users.
As one-off surveys, CES questionnaires offer data with short-term relevancy. Hence, they must be triggered right after the interaction or transaction in question, wherever it takes place (be it email, in-app or chat), so that the feedback is tied to its context.
And lastly, CES surveys can tell you that a customer had difficulty solving a problem, but they don’t tell you why. For example, if a consumer says it was hard to get something your brand doesn’t actually offer, that’s not a relevant result for your business.
Is Only Using Customer Effort Score Surveys Enough?
While CES app surveys are a great source of customer insight, it’s better when you pair them with a satisfaction-oriented survey – like Net Promoter Score, for example.
For the sake of this article, we’ll throw in a quick definition: NPS is a customer satisfaction survey that asks consumers how likely they are (on a scale from 0 to 10) to recommend your brand to other people. NPS surveys allow you to send follow-up questions to ask why the customer gave a particular rating, essentially letting you find out what exactly you need to improve to boost customer loyalty.
Using CES alongside NPS will let you accurately measure both consumer effort and loyalty. The two metrics effectively complement each other and allow you to focus on two vital aspects of your business instead of just one – especially since NPS lets you segment customers. Moreover, it seems that top-performing low-effort companies tend to have an NPS that is 65 points higher than top-performing high-effort businesses, further showing the link between Customer Effort Score and Net Promoter Score.
Bottom line
CES is one more transactional instrument in your toolbox that can help you pinpoint weaknesses across service interactions and a product’s ease of use. Since customer experience expectations are ever-evolving, consistently keeping an eye on effort scores, overall customer satisfaction and trends in the data is already a necessity. However, more data isn’t necessarily better data. The metrics have value only when the respective feedback is converted into follow-up actions and product improvements.
Whether you are looking for a single customer satisfaction metric or a more complex approach, Retently has you covered. You can have all your data – NPS, CSAT, CES – under one roof, with insightful analytics to help you sift through the volume of feedback. Sign up for your free trial to see for yourself how easy it is to get started.