Improve your startup’s surveys and get even better data
Startups frequently use surveys as a cheap and easy way to get feedback from users. But the resulting data will only be as good as the survey itself. I often see products with surveys that have easy-to-fix mistakes like misleading questions, improper sampling, and skewed rating scales. That’s a shame — these teams could be collecting better data and making better decisions if they just paid a bit more attention to survey design.
There are plenty of places to learn about survey design. But if you’re at a startup, you probably don’t have time to wade through a bunch of academic articles. Here are some practical tips and guidelines to help you create better surveys right away.
When to use a survey
Before designing your survey, think about the questions you want answered, and decide if surveys are the right tool for the job.
Surveys are great when you want to…
- Track changes over time — See what changes before and after a feature launch.
- Quantify issues seen in user studies — We know [x] is a problem for some users, but how many?
- Measure attitudes, intents, or task success.
But surveys are not very good at…
- Discovering underlying user motivations and needs — User interviews are better suited for this.
- Understanding whether people can successfully use your product — Again, user studies are better for this.
- Uncovering actual user behavior and habits — Since people are bad at self-reporting, log analysis is better suited for this.
If a survey is the right method to answer your questions, here are some good tips for getting started.
Survey basics
Only ask what you need to know and can act on — Avoid “nice-to-know” questions, as they just increase the length of your survey. Every survey question should have an actionable outcome.
Keep surveys short — Keep your surveys under five minutes. If your survey is long, tell respondents up-front how long it’ll take.
Start broad, then move to specific and sensitive — Ask about overall experiences and then dive into detailed questions.
Group related questions together — Avoid context switching too much; group questions of the same topic together.
Randomize answers to avoid response order effects — In a vertical list, top answers tend to have the highest selection rate. Randomize the list order across participants to minimize this effect. (You can do this with SurveyMonkey.)
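If you're building your own survey tool rather than using an off-the-shelf one, per-respondent randomization can be sketched like this in Python (the option labels and respondent IDs are just illustrative):

```python
import random

OPTIONS = ["Price", "Ease of use", "Customer support", "Performance"]

def randomized_options(respondent_id: str, options=OPTIONS):
    """Return a per-respondent shuffled copy of the answer options.

    Seeding the RNG with the respondent ID keeps the order stable
    if the same person reloads the survey page.
    """
    rng = random.Random(respondent_id)  # deterministic per respondent
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled
```

One design note: store each answer against its canonical label, not its on-screen position, so the shuffled order never leaks into your analysis.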
Avoid images and be aware of your survey’s visual design — The visual design of your survey can influence how people respond. Showing an image to clarify what’s being asked (e.g. screenshot of a product feature) is fine, but having a random background image (e.g. happy people talking) can influence how people feel, and change the way they respond.
Pre-test your survey — Before launching your survey, test it on a few friends or colleagues. Ask them to describe what each question is asking in their own words and to flag any points of confusion. This is particularly important at early-stage startups: your user base may be only a few hundred people, and once you've surveyed them you won't be able to survey them again for a while.
Watch out for these pitfalls
Avoid leading questions
Would you use this improved version of product X?
Would you like Walmart more if the aisles were less cluttered?
Leading questions suggest or guide respondents into answering in a particular way. The first example implies that the latest version is an improvement, thus leading respondents to want to answer yes. (Who wouldn’t want to use a better product?) In the second example, respondents are more likely to state that they would like Walmart more if their aisles were less cluttered because “clutter” has a negative connotation.
Avoid suggestive wording and ask questions in a neutral way. Instead of asking users if they would use an improved product, ask them what they like and dislike about existing and new versions.
Avoid agree/disagree statements
Do you agree or disagree with the following statements: I liked this article a great deal. My overall health is excellent.
While agreement statements are relatively easy to craft and analyze, they suffer from acquiescence bias — the tendency for respondents to be agreeable out of politeness or a desire to satisfice.
Instead, ask questions using an item-specific rating scale:
How satisfied or dissatisfied are you with this post? ( Extremely Dissatisfied | Moderately Dissatisfied | Slightly Dissatisfied | Neither | Slightly Satisfied | Moderately Satisfied | Extremely Satisfied )
How would you rate your health overall? ( Excellent | Very Good | Good | Fair | Poor )
Avoid “double-barreled” questions
How satisfied are you with Geico’s payment and billing options?
Double-barreled questions ask respondents about multiple issues at once (e.g. payment and billing options). Respondents may feel differently about each item, making the data essentially impossible to accurately interpret — are respondents slightly satisfied with both options? With just the payment options and not billing options? Or are they reporting their average satisfaction with both?
Instead, ask about one item per question. Ask separate questions if needed. How satisfied are you with Geico’s payment options? How satisfied are you with Geico’s billing options?
Avoid asking “why” to get at motivations
Why did you click on the ad?
People often can’t explain their own behaviors when asked about very specific instances and actions. For example, someone can probably tell you why they came to your site (overall intent), but not why they clicked on a particular link.
Instead, focus on intent: What did you come here to do today? You can use interviews and other methods to uncover core motivations and needs.
Avoid vague terms
How many times did you work from home in Q1?
Avoid using phrases and words that may be interpreted in multiple ways. While “Q1” may mean January through March to you, others may interpret the time frame differently, resulting in inaccurate reporting across participants.
Instead, ask questions in a specific way and leave little to interpretation. How many times did you work from home in the past 30 days?
Avoid hypothetical questions
Would you use feature [x] if we offered it?
Respondents can’t accurately predict how they would act in a hypothetical situation. While most people like the idea of having more features and functionality in a product, it doesn’t necessarily mean they’d use them.
Instead, ask about current experiences, intent, and frustrations. Once you figure out what people like or don’t like about a product, you’ll be able to decide which features to offer.
Avoid comparison questions
Do you like the previous or current version of Facebook better?
People often use surveys to gauge whether a redesign was successful by asking users to compare the current version to a previous version of the product.
Instead, use two separate surveys — one in the old version and one in the new version — and ask respondents to rate their experiences with the product they’re using. Not only will respondents be immersed in the experience they are evaluating, but they won’t be biased into thinking that the new version is better.
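Once you have responses from both surveys, compare the rating distributions rather than asking anyone to compare versions directly. A minimal sketch in Python, using hypothetical 7-point satisfaction ratings:

```python
# Hypothetical 7-point satisfaction ratings from the two surveys.
old_version = [5, 6, 4, 7, 5, 6, 5, 4]
new_version = [6, 7, 5, 6, 7, 6, 5, 7]

def mean(ratings):
    """Average rating for one survey's responses."""
    return sum(ratings) / len(ratings)

diff = mean(new_version) - mean(old_version)
print(f"old: {mean(old_version):.2f}, new: {mean(new_version):.2f}, diff: {diff:+.2f}")
```

With real data you'd also want to check that the difference isn't just noise (e.g. with a significance test), especially with the small samples typical of early-stage startups.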
Rating scales and grids
Avoid large grid questions — Large grids will cause people to skip the question or exit the survey entirely.
Use fully labeled, equally spaced scales — Label every point on the scale so there’s no confusion (“what does 5 mean on a 7-point scale?”) and make sure each point is the same width and height.

Bipolar and unipolar questions — Bipolar questions have answers that range from negative to positive (e.g. Extremely Dissatisfied through Neither to Extremely Satisfied); use a fully labeled 7-point scale for these. Unipolar questions have answers that range from zero to positive (e.g. Not At All Helpful to Extremely Helpful); use a fully labeled 5-point scale for these.
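If you keep survey definitions in code, the two scale types can be captured as fully labeled option lists. A minimal Python sketch (the names and exact wording are illustrative, not prescribed here):

```python
# A bipolar scale: negative through neutral to positive, 7 points, every
# point labeled so respondents never have to guess what a number means.
BIPOLAR_SATISFACTION_7PT = [
    "Extremely dissatisfied",
    "Moderately dissatisfied",
    "Slightly dissatisfied",
    "Neither satisfied nor dissatisfied",
    "Slightly satisfied",
    "Moderately satisfied",
    "Extremely satisfied",
]

# A unipolar scale: zero to positive, 5 points, also fully labeled.
UNIPOLAR_HELPFULNESS_5PT = [
    "Not at all helpful",
    "Slightly helpful",
    "Moderately helpful",
    "Very helpful",
    "Extremely helpful",
]
```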

Hopefully these tips will help you start writing better surveys right away. If you’d like to learn more about survey design, here are some great books:
- Survey Methodology by Groves, Fowler, and Couper
- Improving Survey Questions: Design and Evaluation by Floyd J. Fowler Jr.
- Handbook of Survey Research edited by Marsden and Wright
What have you learned from running surveys at your startup? What survey design resources have you found helpful? Jump into the comments below and let’s discuss.