Experience has certainly been getting a lot of buzz over the last couple of years. Its proponents credit it with driving competitive advantage, customer loyalty and customer attraction.
And it looks like the domain is only set to grow in business focus: in 2016 we will see, "the gap between customer-obsessed leaders and laggards widen. Leaders will tackle the hard work of shifting to a customer-obsessed operating model; laggards will aimlessly push forward with flawed digital priorities and disjointed operations." And, "in just a few years, 89% of businesses will compete mainly on customer experience", so companies that want to succeed are really getting to grips with understanding and enabling customer-centric design and focus throughout their organizations - including their own employee experience and their user experience.
One area that has also seen transformation and growth is the question of how to measure an experience in a tangible and actionable way. A quick Google search will return plenty of thoughts on the subject.
For years, companies have generated "customer experience surveys" and trended that wonderful C-Sat percentage on scorecards and dashboards galore. But what does 7/10 actually allow you to do? What happens when you see a "green" dashboard but still have customers leaving unsatisfied? I was once in a meeting with a client who referred to this as a "watermelon": all green on the outside, but red as can be inside. NPS has also seen traction and is indeed an easily digestible health check. While these measures all have their purpose and a critical role to play in holistic understanding, acting on the trend often requires further and deeper insights, and no one measure alone will give the full picture.
As we spoke about at SXSW 2015, the perception gap between brands' and their customers' views of experience is what largely leads to this "watermelon effect": the provider believes everything is great; the customer or user, not so much. One example I will never forget came from the customer service industry. An acceptable "answer time" SLA had been set, but the scorecard was red for this measure. To solve the issue, the agents picked up the phone, said "hello" and then promptly put the customers on hold. All of a sudden the scorecard turned green, but the customers were still unhappy - obviously. The thing is, who determined this business SLA? The customers were not there when that deal was struck, so who deemed the SLA satisfactory for them?
Don't get me wrong; business and operations need their SLAs and measures. But experience is a different type of measure. C-Sat may do well for general trending, but uncovering what levers affect your customer experience is key.
It is worthwhile to remember that Experience is about emotion. It is about how your product, service, brand or solution makes your customer feel. When measuring your customer's experience, remember this is actually their perception of the interaction. This type of data has several facets to think about - and here are just a few:
Is it Actionable?
So you go to the trouble to ask customers how they would rate their experience on a scale of 1-10...now what? Say 86% score it 6/10 or higher...what will you do with that? What does 6/10 even mean? Why did you decide 6/10 was good? Whenever you ask for any data, it is a good idea to ask yourself: what can I do with it? Can I improve the experience, or innovate a new experience, with this data?
Of course, as I mentioned, having a general trend is good - but so many people forget to dig a little deeper into their insights to really see which part of the experience is broken or needs to be changed.
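To make that digging concrete, here is a minimal sketch in Python, using entirely hypothetical touchpoint names and scores, of how the same set of responses can look acceptable as an overall average yet reveal a broken step once you break it down by part of the journey.

```python
# A minimal sketch, using hypothetical touchpoint names and scores, of why an
# overall average hides the actionable detail.

scores = [
    {"touchpoint": "browse",   "score": 9},
    {"touchpoint": "browse",   "score": 8},
    {"touchpoint": "checkout", "score": 4},
    {"touchpoint": "checkout", "score": 5},
    {"touchpoint": "delivery", "score": 8},
    {"touchpoint": "delivery", "score": 7},
]

# The headline number looks acceptable on a dashboard...
overall = sum(s["score"] for s in scores) / len(scores)
print(f"Overall average: {overall:.1f}/10")

# ...but breaking the same responses down by touchpoint shows where the
# experience is actually broken.
by_touchpoint = {}
for s in scores:
    by_touchpoint.setdefault(s["touchpoint"], []).append(s["score"])

for touchpoint, values in sorted(by_touchpoint.items()):
    print(f"{touchpoint}: {sum(values) / len(values):.1f}/10")
```

The overall figure here sits around 6.8/10, which would trend "green", while the checkout step alone averages 4.5/10 - exactly the kind of detail a single trended score never surfaces.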
When do you Capture the Data?
Let's not beat around the bush: bad data equals bad decisions, and data can be skewed any way you want. In the case of customer experience and surveys, you have numerous potential skews. How did the customer feel at the time? Did you catch them at a bad moment of their day? When the mind is in a "negative affect" state, focus narrows, so that tiny issue they usually wouldn't notice comes back to them and they lower their score. Or did you just get the customer who feels a little too awkward to reveal their true thoughts, so they click 5/10 on everything?
Sure, the law of large numbers could minimize the effect, but this is not a gamble I'd take when making core decisions and defining strategies that could lead to business success or failure. It is also better to gather those deep and meaningful insights directly from the people who are your customers or potential customers, in addition to trend data.
Who is Generating the Data?
Today, with the web of the customer journey, there are several moments to collect data, and data can be system- or human-generated. While a system could be collecting the time of a call, or how long someone was on hold, the customer could be generating their own thoughts and comments.
The careful and purposeful design of what data you use, and for what purpose, is incredibly important in building a clear view of the experience your customers are receiving.
It is worth mentioning here that system- versus human-generated is not the same as digital versus physical. You can capture human-generated data both digitally and physically, so to speak: online sentiment from social streams, for example, or "physically" via interviews.
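As an illustration of using the two together, here is a minimal sketch in Python, with hypothetical call IDs, field names and data, that pairs a system-generated measure (hold time) with the human-generated comment for the same interaction, so neither is read in isolation.

```python
# A minimal sketch, with hypothetical call IDs, field names and data, pairing
# a system-generated measure (hold time) with the human-generated comment for
# the same interaction.

system_log = {
    "call-001": {"hold_seconds": 12},
    "call-002": {"hold_seconds": 540},
}

customer_feedback = {
    "call-001": "Quick and friendly, thanks.",
    "call-002": "Answered fast, then left me on hold forever.",
}

# Read the two views of each interaction together, so a "green" system metric
# is never interpreted without the customer's own words.
for call_id, metrics in system_log.items():
    comment = customer_feedback.get(call_id, "(no comment left)")
    print(f"{call_id}: hold time {metrics['hold_seconds']}s | customer said: {comment}")
```

Seen side by side, the second call would pass an "answer time" SLA while the customer's comment tells the real story - the watermelon effect in miniature.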
What are you Asking?
This is certainly a case of garbage in, garbage out. To be honest, if you ask pointless questions, then you can expect pointless responses. If you do not know how your customers measure you, are you sure you are asking the right questions? Do you have questions like "how fast and easy was the service?" When customers say yes, did they mean it was fast, it was easy, or both? Enough said.
How are you Capturing the Data?
There are several ways to capture data; determining the best way for you will depend on the purpose and how you plan to use it. We often see a mix of observation for deep insights, interviews and surveys, as well as social and other data analytics. The key always comes back to knowing what you need to measure to get an actionable view of your customer's experience.
How Experience Design Thinking Can Help
At effectUX we help our clients firstly to uncover their "Experience Factors" using our Experience Factor Decomposition™ methodology. Then we look at where and how to collect this data effectively within their experience ecosystem, using both system- and human-generated data.
Given that Experience is all about how people feel and perceive their interactions with your brand, we see two things as worth considering in the quest to meaningfully measure your customer experience.
Firstly, it all comes down to taking the time to figure out what actually affects your customer experience. How are customers measuring you? Don't guess this; work with actual humans and find out. Then, secondly, you can figure out where and how to effectively measure this at scale across the various interactions, channels and touchpoints throughout the customer journey.