Surveys are the most overused and most misused research tool in the product toolkit. They’re easy to send, they produce numbers, and numbers feel like evidence. The problem is that bad survey data isn’t just useless - it’s actively misleading. You end up making confident decisions based on what people said they’d do, not what they actually do 😬. Used well, though, surveys are genuinely useful. The key is knowing what they’re good for - and what they’re not.

What surveys are good for

  • Quantifying what you already know qualitatively - You’ve run ten customer interviews and heard the same three pain points. A survey can tell you how widespread those pain points are across your full customer base. Interviews find the signal; surveys measure its scale.
  • Tracking changes over time - NPS, CSAT, and similar metrics are survey-based for a reason. Their value isn’t in any single data point but in the trend. Are customers happier or less happy than last quarter?
  • Reaching customers you can’t interview - Some customer segments are hard to get on a call. A well-designed survey can give you signal from people who’d never book a 30-minute interview.

What surveys are bad for

  • Discovering unknown problems - A survey can only ask about things you already know to ask about. It can’t surface the workaround the customer invented that you’ve never heard of. That’s what customer interviews are for.
  • Understanding why - “60% of users find onboarding confusing” is a finding. It tells you nothing about what’s confusing or how to fix it. Surveys surface what; interviews surface why.
  • Predicting behaviour - As Rob Fitzpatrick argues in The Mom Test, people are terrible at predicting their own future actions. “Would you use this feature?” is a survey question that generates noise, not signal.

Writing better survey questions

Most survey questions are leading, vague, or hypothetical - and produce responses that confirm whatever the team already believed. A few rules:
  • Avoid leading questions - “How much do you love our new dashboard?” is not a question.
  • Ask about the past, not the future - “How often did you export data last month?” beats “How often would you use an export feature?”
  • Use open-ended questions sparingly but deliberately - They’re harder to analyse but often contain the most useful insight. One good open-ended question at the end of a survey beats ten closed ones.
  • Test your survey on someone first - If they have to ask what a question means, rewrite it.

Response rates and sample size

A survey with a 5% response rate has a self-selection problem: the people who responded aren’t a random sample of your customers. They’re the ones who felt strongly enough to reply, which skews the data in ways that are hard to correct for. Chase response rate before you chase sample size. A short, well-targeted survey sent to the right segment at the right moment beats a long survey blasted to your entire list 📊

Lesson learned: the most useful thing I ever got from a survey was the open-ended “anything else?” box at the end. Customers will tell you things in a text box that no multiple-choice question would ever surface 👀
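The self-selection effect is easy to see in a quick simulation. The sketch below is purely illustrative - the satisfaction distribution and the response probabilities are invented numbers, chosen only to show how a survey mean can drift away from the true mean when customers with strong feelings reply more often:

```python
import random

random.seed(42)

# Hypothetical customer base: satisfaction scores from 1 to 5.
# This distribution is made up for illustration.
population = [random.choice([1, 2, 3, 3, 4, 4, 4, 5]) for _ in range(10_000)]

# Assumed self-selection: customers with strong feelings (1s and 5s)
# are far more likely to answer the survey than those in the middle.
RESPONSE_PROB = {1: 0.15, 2: 0.06, 3: 0.03, 4: 0.03, 5: 0.12}

def responds(score: int) -> bool:
    """Randomly decide whether a customer with this score replies."""
    return random.random() < RESPONSE_PROB[score]

respondents = [s for s in population if responds(s)]

true_mean = sum(population) / len(population)
survey_mean = sum(respondents) / len(respondents)
response_rate = len(respondents) / len(population)

print(f"response rate:            {response_rate:.1%}")
print(f"true mean satisfaction:   {true_mean:.2f}")
print(f"survey mean satisfaction: {survey_mean:.2f}")
```

Under these made-up numbers, the respondents over-represent the extremes, so the survey mean lands noticeably away from the true population mean even with thousands of customers in the base - more responses from the same skewed pool wouldn’t fix it, which is the point about chasing response rate over raw sample size.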