Micro-surveys: a faster way to learn about your users

Sasha Lubomirsky
Published in GV Library · 5 min read · Feb 15, 2012

First, a question: what prompted you to click on the link to this post? Yes, this one—the one you’re reading now.

At this point, you’re probably able to answer that question with relative ease. If someone asked you this in a week, you would probably just look blankly at the questioner: what post?

Micro-surveys are short, targeted, and timely questions like the one above.

Imagine a micro-survey that asks “What prompted you to leave the registration page?” of a user who has just left the registration page.

Now compare that to a broader, more traditional survey that asks questions like “What is unappealing about our service?” several weeks later.

General questions, sent long after the relevant behavior, can be difficult to answer. And people are a lot more likely to answer one quick question than a lengthy list of questions. While traditional surveys certainly have a place in a researcher’s repertoire, micro-surveys are an effective, targeted tool: they not only get you useful data, they’re also relatively easy to implement and analyze.

Delivering your micro-survey

There are a couple of different ways you can deliver a micro-survey. Good old-fashioned email is a great way to get a response. It’s easy to set up — you don’t need anyone’s help to send an email, hopefully — and users are more likely to respond if they don’t have to click a link.

When writing the email, don’t try to get fancy and make the email look official. If it looks like it was sent by a real person, it’s much less likely to be ignored. Be friendly but brief — aim for no more than four sentences total. For example, you might send an email that says:

Hey June,

We noticed you just tried out [Our Awesome New Feature™] but left pretty quickly. Did anything in particular prompt that? We’re experimenting with it, so any feedback you have would be much appreciated!

Thanks,
Sasha

Another way to deliver a micro-survey is a targeted in-page question that pops up after a specific behavior. Qualaroo, for example, lets you show a quick survey on specific URLs to users who match particular criteria (e.g., show the survey only to returning users).

The downside of this option compared to email is that if you don’t have access to the codebase, you’ll have to ask an engineer to add the survey code to your site. Once the code is in place, it’s pretty straightforward to turn the survey on and off.

This option is also nice because it asks the question immediately after the specified behavior, and the question appears right on the page; there’s no link to click. The prompt can sit in the lower right of the page, which keeps it relatively unobtrusive.
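If you don’t want to use a third-party tool, the logic behind this kind of prompt is small. Here’s a minimal sketch in plain browser TypeScript (illustrative only, not Qualaroo’s actual API; the /register path and /api/micro-survey endpoint are made-up examples) that shows a one-question prompt in the lower right of a specific page, only to returning visitors:

```typescript
// Minimal in-page micro-survey sketch. Illustrative only: this is not
// Qualaroo's API, and the page path and endpoint below are made-up examples.
function showMicroSurvey(question: string, onAnswer: (answer: string) => void): void {
  // Only ask returning visitors, and only on the page we care about.
  const isReturningVisitor = localStorage.getItem("hasVisitedBefore") === "true";
  localStorage.setItem("hasVisitedBefore", "true");
  if (!isReturningVisitor || window.location.pathname !== "/register") return;

  // Build a small prompt pinned to the lower right of the page.
  const box = document.createElement("div");
  box.style.cssText =
    "position:fixed;bottom:16px;right:16px;max-width:280px;padding:12px;" +
    "background:#fff;border:1px solid #ccc;border-radius:6px;font:14px sans-serif;";

  const label = document.createElement("p");
  label.textContent = question;

  const input = document.createElement("textarea");
  input.rows = 2;
  input.style.width = "100%";

  const submit = document.createElement("button");
  submit.textContent = "Send";
  submit.onclick = () => {
    onAnswer(input.value);
    box.remove();
  };

  box.append(label, input, submit);
  document.body.appendChild(box);
}

// Usage: record the answer with your own endpoint (hypothetical URL).
showMicroSurvey("What prompted you to visit the registration page today?", (answer) => {
  fetch("/api/micro-survey", { method: "POST", body: JSON.stringify({ answer }) });
});
```

In practice you’d also want to remember who has already answered (or dismissed the prompt) so the same person isn’t asked twice.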

Making your micro-survey awesome

Once you’ve figured out how to deliver the micro-survey, there are a couple things to keep in mind:

  • Ask focused questions about specific parts of the experience. Answers to broad questions will include a lot of variety and be more difficult to draw conclusions from. For example:

Not good: “What is different than you expected in using the site?” on a browsing-oriented page.

Good: “What did you expect to happen when you clicked on ‘Buy it’?” after a user has left the “Buy it” page.

  • Be brief: No more than 3 questions. Duh. Otherwise it’s just a “survey.”
  • The number of responses you need varies. Depending on how straightforward your question is, you may see an overwhelming pattern after 20 responses, even though it’s not statistically significant. If you have no idea, aim for a couple hundred. But if that’s not possible, it’s still worth gathering responses to see if there are any emerging patterns.
  • Open-ended questions are best. Multiple-choice and rating questions are fine, but they’re most meaningful when you know what kind of responses to expect (e.g. from previous research) and when you’re tracking changes in responses over time.
  • Use your micro-survey as an invitation to chat more. One or two questions miss a lot of context, and you’re relying on self-reporting which is never 100% reliable. As a result, you should follow up with some users to get more detail. Consider closing your survey with: “Thanks for your help! Are you interested in chatting further?” and asking for the person’s email.

Analyzing your responses

Once you start getting responses, you’ll want to look for patterns.

Usually it’s helpful to collect the responses in a spreadsheet. If you’re using an automated survey tool like KISSinsights, you can download a CSV file from the application. If you’re using email, you’ll have to manually copy and paste responses into a spreadsheet.

As you review the responses, give each one a category, like “bug” or “product too expensive.” You’ll start to see patterns, and as you read through more responses, you’ll keep refining the categories. Some responses will simply be one-offs that you put in a “misc” category.

I also have a column titled “Good quote” which I mark when a quote is representative, rich with detail, or just hilarious. (One of the joys of analyzing open-ended survey results is reading the unintended gems users leave along the way. My favorite recent one was in Russian and included a phrase that roughly translated to, “People who use social networks are lazy hooligans!!”)

Once you’ve categorized all the responses, count them up. In Google Spreadsheets and Excel, you can get the percentage for each category by using =COUNTIF(range, criteria). For example: =COUNTIF(B1:B200, "product too expensive")/200 if you have 200 rows in column B (where the categories are listed).
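If your responses end up in a CSV export rather than a spreadsheet, a short script can do the same tallying. Here’s a minimal sketch; the file name responses.csv and the “category” column header are assumptions for the example, and it assumes no field contains a comma:

```typescript
// Tally categorized micro-survey responses from a CSV export.
// Assumptions for this sketch: a file named "responses.csv" with a header row,
// a column called "category", and no commas inside any field.
import { readFileSync } from "fs";

const rows = readFileSync("responses.csv", "utf8")
  .trim()
  .split("\n")
  .map((line) => line.split(","));

const header = rows[0];
const categoryIndex = header.indexOf("category");

// Count how many responses fall into each category.
const counts = new Map<string, number>();
for (const row of rows.slice(1)) {
  const category = row[categoryIndex]?.trim() || "misc";
  counts.set(category, (counts.get(category) ?? 0) + 1);
}

// Print each category with its share of the total, e.g. "bug: 23 (11.5%)".
const total = rows.length - 1;
for (const [category, count] of counts) {
  console.log(`${category}: ${count} (${((count / total) * 100).toFixed(1)}%)`);
}
```

Running it prints one line per category, which you can paste straight into your write-up.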

When you look at the results, see if there’s a particular story to tell. Does this help the team understand the problem? Does it help prioritize features? When you share the results, be sure to include those representative quotes so your team has a clear (and perhaps empathetic) understanding of each category. Voilà! You and your team now know more about your users than you did a day or two ago.

Traditional surveys are still useful, but micro-surveys are a great tool for getting quick feedback when you’re trying to move fast and stay user-centered.

Have you tried micro-surveys? What have you found more or less useful?
