3 Customer Survey Lessons from this Year’s Presidential Polling Fail

By: Sid Banerjee

January 26, 2017

Tags:
Clarabridge Analytics
Clarabridge Engage
Contact Center
Customer Experience
Customer Feedback Management

The pre-election polls were way off this year — much more so than usual. While the official results may have taken some of us by surprise, there’s a lot we can learn from this year’s presidential polling fail. The truth of the matter is that customer surveys can be just as misleading. And that’s bad news if you’re building your business around survey results.

That’s not to say that surveys don’t provide valuable insight. It’s just that you need to be careful about how you structure surveys and how you interpret the results; otherwise, the results may not accurately represent your customer base. It’s also important to recognize that surveys are not the only technique for measuring loyalty, whether it’s voting for a candidate or recommending a product or service. There are steps you can take to ensure you’re getting accurate, valuable information from your customers. Here are three important survey lessons CX leaders should take away from this year’s polling faux pas:

Survey Samples Aren’t Always Representative

It’s important to remember that surveys are imperfect: the person who responds to your questionnaire doesn’t necessarily represent the majority opinion of your customers. We watched this play out on a massive scale with the LA Times pre-election polling, which actually predicted that Donald Trump would win the election. Although the LA Times was technically right about the overall election result, the publication was completely wrong in predicting how the African American population would vote. That’s because the poll surveyed just one African American voter who happened to be a Trump supporter. This single response was then weighted to represent 8 percent of the population.

The lesson here is that brands should collect as many data points from as many channels as possible. Customers who respond to post-transaction surveys are more likely to describe recent experiences as good or bad. Relationship surveys, meanwhile, contain more general loyalty insights, and social customer engagement is often skewed toward cries for help. Support calls, chats, and emails contain disproportionately more insights on customer problems and inquiries. And forums often contain customer-to-customer interactions that suggest opportunities for self-service problem resolution.

If you rely primarily on just one or two sources for customer insight, you’ll get a myopic view of the customer experience and may miss important loyalty drivers and opportunities for improvement. Make sure to combine survey feedback with organic customer data collected from social media, emails, phone calls, and beyond. This ensures that you get answers to the specific questions you want to ask, as well as answers to the questions you didn’t think to ask. The result? A more accurate understanding of the customer journey.
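For teams looking to put this into practice, here is a minimal sketch in Python of pulling survey, social, and contact-center comments into one list so they can be analyzed side by side. The FeedbackItem structure, channel names, and records are illustrative assumptions, not a reference to any particular platform.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class FeedbackItem:
    channel: str      # e.g. "survey", "social", "call"
    customer_id: str
    text: str         # verbatim comment, post, or transcript snippet

# Illustrative records -- in practice these would come from your survey
# tool, social listening feed, and contact-center logs.
survey_feedback = [FeedbackItem("survey", "c1", "Checkout was quick and easy.")]
social_feedback = [FeedbackItem("social", "c2", "Still waiting on my refund...")]
call_feedback = [FeedbackItem("call", "c3", "Could not reset my password.")]

# Combine every channel into one view of the customer.
all_feedback = survey_feedback + social_feedback + call_feedback

# A quick look at the channel mix helps spot a skewed sample: if most of
# your feedback comes from one channel, your "voice of the customer" is
# really the voice of that channel.
channel_mix = Counter(item.channel for item in all_feedback)
print(channel_mix)  # Counter({'survey': 1, 'social': 1, 'call': 1})
```

Even a simple channel-mix count like this makes it obvious when one feedback source is drowning out the others.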

Energy and Engagement Matter

There’s more to customer feedback than meets the eye. It’s not just about what customers say, but also about how they say it and how often they say it. This election season, Trump supporters were more engaged on social media than Hillary Clinton’s supporters. They were vocal, passionate, and showed unparalleled energy. According to Cision, Trump supporters shared 657 percent more campaign-related posts than Clinton supporters over a critical 30-day stretch in the fall of 2016.

Social media engagement (measured by the number of interactions) and passion (measured by the degree of sentiment, both positive and negative) have correlated with higher voter turnout in the years since social media became an important political messaging tool. In 2012, Obama had seven times as many Twitter followers as Romney and three times as many Facebook followers, and those followers showed significantly higher passion and engagement in social chatter. Ultimately, passion and engagement appear to impact likelihood to vote.

The same holds true for your customers — passion and engagement matter, and likewise appear to predict customer loyalty and the likelihood to recommend a company. When digging into customer survey feedback, social feedback, and even call center and email interactions, it’s important to take note of the qualitative content in the feedback, not just survey scores or quantitative measures of an interaction like call duration or issue resolution metrics.

Take tone, sentiment, and emotion into consideration and strive to understand the context around each piece of feedback. It’s much easier to measure energy and engagement on social, so be sure to collect all that you can from this kind of feedback. And make sure to dig into the open-ended questions in survey feedback; this “raw” feedback contains critical clues about the customer’s level of excitement or passion. It also unearths details about the issues and topics that truly matter to the customer. Doing so allows you to better understand your customer base.
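To put rough numbers on these ideas, here is a minimal sketch, in Python with a toy word list and invented posts, of the two measures described above: engagement as a simple count of interactions, and passion as the average strength of sentiment, positive or negative. A real deployment would use a proper sentiment model, but the arithmetic stays the same.

```python
from statistics import mean

# Toy lexicon -- a real deployment would use a trained sentiment model.
POSITIVE = {"love", "great", "amazing", "recommend"}
NEGATIVE = {"hate", "terrible", "awful", "cancel"}

def sentiment(text: str) -> float:
    """Crude sentiment score in [-1, 1] based on word counts."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

posts = [
    "I love this product, would recommend it to anyone!",
    "Terrible support experience. About to cancel.",
    "Order arrived on Tuesday.",
]

engagement = len(posts)                           # how often customers speak up
passion = mean(abs(sentiment(p)) for p in posts)  # how strongly they feel, either way

print(f"engagement={engagement}, passion={passion:.2f}")
```

Note that a neutral logistics update counts toward engagement but adds nothing to passion; that distinction is exactly what a score-only view of feedback misses.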

Listen First, Ask Second

Oftentimes, surveys ask customers about what the brand cares about, not what the customer cares about. The problem is that you end up with answers to questions that don’t actually matter for customer satisfaction. When we look at this year’s pre-election polling process, it’s obvious that it involved a lot of asking and not a whole lot of listening. If pollsters had listened to the conversations voters were naturally having and asked more open-ended questions in polling surveys, it’s likely they would have had a much better sense of the issues that mattered most and how well candidates were addressing those issues.

If you take a “listen first, ask second” approach when collecting customer feedback, you’re far more likely to get a complete and accurate picture of who your customers are and what they need. Ask fewer multiple-choice, yes/no, and 1-to-10 rating questions. Instead, ask more open-ended questions that respondents can answer in their own voice. Listen more to what customers are saying on social and take note of why they’re calling or emailing. That way, it’s less about the brand and more about the customers.
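One lightweight way to “listen first” at scale is to see which themes surface on their own in open-ended answers before writing the next questionnaire. The sketch below, in Python with invented responses and a hypothetical theme list, simply counts theme mentions; dedicated text analytics goes much further, but even this shows what customers raise unprompted.

```python
from collections import Counter

# Hypothetical themes to watch for -- in practice these would be
# discovered from the text itself, not predefined by the brand.
THEMES = {
    "shipping": ["shipping", "delivery", "arrived"],
    "pricing": ["price", "expensive", "cheap"],
    "support": ["support", "agent", "help"],
}

open_ended_responses = [
    "Delivery took two weeks, way too long.",
    "Support agent was friendly but couldn't help.",
    "Love the product, but it's a bit expensive.",
]

theme_counts = Counter()
for response in open_ended_responses:
    text = response.lower()
    for theme, keywords in THEMES.items():
        if any(k in text for k in keywords):
            theme_counts[theme] += 1

# The most common themes are good candidates for your next survey questions.
print(theme_counts.most_common())
```

Themes that keep showing up without being asked about are strong candidates for the questions, and the fixes, that actually matter to customers.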

At the end of the day, surveys can be a valuable tool for understanding sentiment and loyalty. But on their own, surveys deliver an imperfect, incomplete view. To truly understand the full-spectrum voice of the voter, or the voice of your customer, learn these survey lessons: Capture feedback from all the channels customers use, not just one. Measure engagement and passion, not just scores and yes/no answers. And use tools and techniques to ask less and listen more, eliciting a truer understanding of the drivers of loyalty, engagement, and likelihood to promote, or to vote.

Sid Banerjee is a well-known expert in customer experience, business intelligence, and text analytics. A Clarabridge co-founder, he provides Clarabridge with vision and direction for achieving customer success while also managing executive relationships with key current and prospective customers. Follow Sid on Twitter @sidbanerjee.
