Use feedback surveys: let your customers tell you how to make your site better

by Tim Leighton-Boyce

Picture of Voice of customer survey word cloud
People are really helpful, given half a chance. Your customers will tell you what they want and how to make your site work better, if you’ll just let them.

Surveys are the secret.

I’ve written about “voice of customer” surveys before and I’ll probably write about them again.

That’s because surveys are such a great source of direct guidance on what you need to do to make your site better for your visitors and better for you. Face to face interviews and usability tests can be even more valuable, but they’re more expensive to run and to analyse.

A couple of things have just come up which bring me back to the subject.

  • One was seeing some very interesting data about the different methods for collecting survey data.
  • The other was an example of how not to do it: a personal experience of a survey which failed to get the vital information.

Both examples provide evidence of why an embedded “any comments” free text survey in plain view, right on the order confirmation page, is a killer app for improving ecommerce sites.

Data on Survey Response Rates

iPerceptions added a new type of survey to their range of (very good) systems in May 2012. They’re called “Comment Cards” and are more of a visitor-feedback system than a survey tool. The key point about these is that they use one of those small invitation tabs down at the bottom of the screen:

Screenshot of iPerceptions feedback invitation

This kind of “solicitation” is also used by several other feedback systems and is less intrusive than the overlays which iPerceptions use on some of their other products, such as their famous 4Q and their enterprise-level webValidator.

At the time of the launch iPerceptions published a blog post explaining the features of the new “Comment Card” system. Near the end of the article there was a section on the disadvantages of this kind of “Visitor-initiated Feedback” system. It contained some data which caught my eye:

“Visitor-initiated feedback garners a significantly lower response rate, averaging 0.1% of website traffic, versus active solutions which boast a 2-5% response rate.”

The 2-5% for an active solution is the kind of figure I expected, although I have never written about it before because I did not have a source I could trust. This data is from one of the major vendors of these systems, so I imagine that the sample size was big. This means that these response rates can be treated as some kind of a benchmark.

I’ve never been keen on using active solutions on an ecommerce site. It just doesn’t seem wise to float a survey invitation in front of a potential customer. We normally strip out anything which could get in the way of someone trying to buy something from our sites. And according to this data, at most 5% would fill in the survey, meaning that the other 95% or more clicked the “get this thing out of my way” button. Ouch.

The visitor-initiated systems don’t have the same problem. 99.9% of people ignore them. So they collect very little data. And if someone does click on the button, it’s normally because they want to complain, as iPerceptions point out:

“Visitor-initiated feedback produces much more negative feedback than an active solicitation, which captures a representative sample of negative and positive views. As a result, it can be difficult to give appropriate weight to the feedback, or consider the feedback in the context of visitors’ overall experience.”

I was really interested by this data because of the way it compares with the method some of my clients use.

The ecommerce customer feedback system I encourage you to try doesn’t need an invitation. It’s a simple form embedded in the order confirmation page. Like this:

Screenshot showing example of embedded voice of customer survey form

A form on the ‘thank you’ page cannot get in the way of a purchase.

People will post praise as well as complaints. The split is usually around 50/50 and knowing what language your customers use to describe your strengths is extremely valuable for SEO and marketing copy.

Wordcloud showing positive customer feedback

Don’t worry: you will also get to hear plenty about bugs and usability problems which forced some people to come back to try again later.

And the response rates?

I’ve just looked at a sample of 100,000 orders. 35% of customers left some feedback on their order. 10% left a free text comment. That’s 10% of orders, not 10% of survey responses.

To emphasise the difference, I’ve also looked at an example where the site changed from using an embedded survey to a user-initiated ‘click here to provide feedback’ link. The response rate fell from 10% to below 1% and the content of the comments shifted to strongly negative. Give people a form, right there on the page…

Update, June 2013: see the new data below showing how an uninteresting or frustrating ‘thank you’ page can halve these rates.

Those comments are the real seam of gold. Like I said, people are very helpful if you give them the chance.

To find out more about these surveys, here’s the start of a two-part series I wrote for Econsultancy: http://econsultancy.com/uk/blog/9134-best-practices-for-e-commerce-consumer-surveys [Opens in new tab]

And, since it would be daft not to do so, here’s an example of a live embedded survey form. It’s a very crude one, using a free account from SurveyGizmo. The survey is not pretty but it has the fields I need and I’m sure you could style one better than this.

The key reason for using SurveyGizmo for this example is that it also lets me pass information into GA via a script within the survey’s ‘Thank you’ page. In this instance I am recording the fact that someone has completed the survey as a ‘visitor scope’ custom variable, so that I can see whether survey respondents return to the site in another session. I’m also recording the values of two of the questions (task completion and Net Promoter) as events. I chose an event for the NPS so that I can easily report on the average for that value in GA, which is useful to see even though the average is not the actual NPS calculation.

The SurveyGizmo code I use for ga.js on this site (and thank you to SurveyGizmo support for helping me sort this out) is:

$(document).ready(function() {
  // Visitor-scope custom variable (slot 4) marking this visitor as a survey respondent
  _gaq.push(['_setCustomVar', 4, 'Survey Completed', 'Yes', 1]);
  // Events recording the answers; [question("value"), id="..."] are SurveyGizmo merge codes
  // which insert the respondent's answer as the event label
  _gaq.push(['_trackEvent', 'survey', 'Completion answered', '[question("value"),id="2"]']);
  _gaq.push(['_trackEvent', 'survey', 'NPS answered', '[question("value"),id="21"]']);
});
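
As mentioned above, the average of those NPS event values is not the actual Net Promoter Score. The standard calculation is the percentage of promoters (scores of 9 or 10) minus the percentage of detractors (0 to 6), so once you export the raw scores from the survey tool or from the GA event labels it is easy to work out. Here's a quick sketch in plain JavaScript (the function name is my own, purely for illustration):

function netPromoterScore(scores) {
  // scores: an array of 0-10 answers to the Net Promoter question
  var promoters = 0, detractors = 0;
  for (var i = 0; i < scores.length; i++) {
    if (scores[i] >= 9) { promoters++; }        // 9 or 10 = promoter
    else if (scores[i] <= 6) { detractors++; }  // 0 to 6 = detractor
  }
  // NPS = % promoters minus % detractors, a figure between -100 and +100
  return 100 * (promoters - detractors) / scores.length;
}

// Example: netPromoterScore([10, 9, 8, 7, 3]) gives 20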

The custom variable will only be sent along with the next hit to GA. In this case there’s no problem because the thank you screen is also sending events. But that’s something to consider if you just want to use custom variables. You would need to edit the question IDs and other details to suit your own survey, of course.
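
If your own ‘thank you’ page sends no other hits, one option is to push a non-interaction event purely to carry the custom variable. This is only a minimal sketch using the standard ga.js calls; it is not part of the SurveyGizmo snippet above, and the 'custom var carrier' and 'thank you page' labels are made-up names for illustration:

$(document).ready(function() {
  // Set the visitor-scope custom variable as before
  _gaq.push(['_setCustomVar', 4, 'Survey Completed', 'Yes', 1]);
  // Send a non-interaction event (the final 'true') purely so the custom
  // variable travels with a hit, without affecting bounce rate
  _gaq.push(['_trackEvent', 'survey', 'custom var carrier', 'thank you page', 0, true]);
});

A virtual pageview sent with _trackPageview would do the same job if you would rather see the confirmation in the content reports.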


How Not to Do It

As you can tell, I'm a big fan of customer surveys. But you must give people the chance to tell you what matters to them. A free text comment field is vital, even if it's hard work analysing it. I know about the work. We sometimes process more than a thousand of these per week as part of our routine analysis.

I recently had a personal experience which drove the point home to me. I used a web site to book my car in for some work. There was a problem with the site which meant that I did not receive the necessary confirmation. I got a raw system error, in fact. And that made me very worried!

I emailed the company to ask if the booking had been made. The contact centre were fine and sent me a .pdf copy.

But the experience had left me wondering if there might have been other technical issues. For example, I wondered if the booking would actually have made it through to the branch in question.

I had lost confidence. I did not want to turn up at 8:30 in the morning only to be told that I would have to come back another day. So I phoned the branch. All was fine and they provided their usual excellent service.

A while later, an email arrived asking me to complete a survey. I wanted to help. I wanted to explain how the experience on the web site had left me feeling uncertain about the rest of their booking system and that the side effect of this was still uncomfortable even after the reply from the contact centre.

Given what I do for a living, I often record my own use of other sites. Here's the summary version of what happened:

The survey included a series of questions, some of which didn't really have an appropriate answer, but I completed it as best I could.

My answers would have appeared very contradictory, because there was no way of indicating that I had been unable to complete the task. They suggested that I was there to do something which the site allows, and that I was doing it for a reason that made sense, and yet I seemed to regard the whole thing as an unsatisfactory experience.

The survey itself did not make it clear whether there would be more questions, so I submitted the first responses in the hope that there might be a comments field. But no. That's it. I couldn't tell them about the server error. I couldn't tell them about the loss of confidence. I couldn't suggest that the email response could have included an "I've checked on the branch calendar..." line, or that maybe they should send out a follow-up email "from" the branch as well.

What a wasted opportunity.

If you're going to ask people for feedback, give them the chance to tell you what they think matters. Don't limit yourself to the things you think might be relevant. Leave space for the "unknown unknowns" because that's where the real value is.

Resources for Voice of Customer Surveys and Feedback

[All open in new window.]

  • Articles about surveys on Econsultancy: why customer surveys are so valuable, and how to set up customer surveys of the kind described here
  • Smart Insights list of survey and feedback systems
  • Brandsavant have a fun guide (you read that right) to analysing sentiment in free-text comments: http://brandsavant.com/how-to-measure-online-sentiment-a-definitive-guide/ The approach we use ourselves is more structured and less enjoyable, but we have to do it as routine work.

Updates

June 2013

I've just noticed an interesting change in the response rates when looking at a survey embedded in a weak confirmation page. By 'weak' I mean a confirmation page which just says 'Thank you for your order' and does not provide details of the order itself.

I suspect people leave the page very quickly on this site because it does not even provide the reassurance of being able to check the order. And there's nothing else of interest. (Here's a link with 11 ways to optimise order confirmation pages: http://www.getelastic.com/11-ways-to-optimize-thank-you-pages/ [opens in new tab].)

In this case the completion rates are roughly half what I would expect. Around 15% of customers start the survey and only about 5% of all customers leave a text comment. There's still plenty of valuable insight from the comments. But the lesson here is about how engagement levels on 'thank you' pages can vary.
