Most product teams have more data than they know what to do with. Dashboards full of metrics, event tracking on every click, weekly analytics reviews. And yet the question “why is retention dropping?” still gets answered with guesses in a meeting. Data and insight are not the same thing 🔍

The difference

- Data is a number: retention dropped 8% last month.
- Information is context: it dropped specifically for users who signed up via the mobile app.
- Insight is a conclusion with a direction: mobile onboarding isn't reaching the activation moment - users are dropping before they complete the first core task. We should fix the mobile onboarding flow.

Most teams stop at data. Some get to information. The ones that consistently act well get to insight - and they do it not by having better data, but by asking better questions 💡

The process

- Start with a question, not a metric. "What's happening to retention?" is better than "let me check the retention dashboard." A question gives you something to follow. A dashboard gives you somewhere to stop.
- Look for anomalies and differences. Flat averages hide stories. Segment the data (by acquisition source, activation status, cohort, device) and look for where the numbers diverge. That divergence is usually where the insight lives.
- Combine quantitative and qualitative. Data tells you what and where. Qualitative research - customer interviews, session recordings, usability testing - tells you why. Neither is sufficient alone.
- State the so-what explicitly. An insight isn't an insight until it implies an action. "Mobile retention is lower" is not an insight. "Mobile retention is lower because the onboarding flow requires a desktop-only step" is an insight. The action follows directly.
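The segmentation step above can be sketched in a few lines. This is a minimal illustration, not a real analytics pipeline - the field names (`signup_source`, `retained`) and the divergence threshold are assumptions for the example:

```python
# Sketch: compute retention per segment and flag segments that diverge
# from the overall rate. Field names and threshold are illustrative.
from collections import defaultdict

def retention_by_segment(users, key):
    """Return (retention rate per segment, overall retention rate)."""
    seg = defaultdict(lambda: [0, 0])  # segment -> [retained count, total count]
    for u in users:
        bucket = seg[u[key]]
        bucket[0] += u["retained"]
        bucket[1] += 1
    overall = sum(r for r, _ in seg.values()) / sum(t for _, t in seg.values())
    return {k: r / t for k, (r, t) in seg.items()}, overall

def divergent_segments(rates, overall, threshold=0.10):
    """Segments whose retention differs from the overall rate by more than threshold."""
    return {k: v for k, v in rates.items() if abs(v - overall) > threshold}

users = [
    {"signup_source": "web", "retained": 1},
    {"signup_source": "web", "retained": 1},
    {"signup_source": "web", "retained": 1},
    {"signup_source": "web", "retained": 0},
    {"signup_source": "mobile", "retained": 1},
    {"signup_source": "mobile", "retained": 0},
    {"signup_source": "mobile", "retained": 0},
    {"signup_source": "mobile", "retained": 0},
]
rates, overall = retention_by_segment(users, "signup_source")
print(divergent_segments(rates, overall))  # both segments diverge from the 0.5 overall
```

The overall rate here is 50%, but web retains at 75% and mobile at 25% - the flat average hides exactly the story the segmentation surfaces.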

The role of hypotheses

The best analysts and PMs work from hypotheses, not fishing expeditions. Before opening the dashboard, form a hypothesis: "I think retention dropped because the new onboarding flow reduced activation for users who skipped the tutorial." Then look for data that confirms or refutes it. This is faster, less susceptible to confirmation bias, and teaches you more even when you're wrong 🙌

How to Measure Anything by Douglas Hubbard is a rigorous read on the discipline of measurement and what it actually means to reduce uncertainty with data.

Lesson learned: the most useful thing I ever did for a team's analytical culture was ban the phrase "the data shows" without a follow-up "which means we should." Data doesn't show anything on its own. Insight does.
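The hypothesis in the example above is checkable with a simple comparison. A minimal sketch, assuming hypothetical `skipped_tutorial` and `activated` fields - in practice you'd also want a significance test before concluding anything:

```python
# Sketch: test the hypothesis "users who skipped the tutorial activate less."
# Field names are illustrative assumptions, not a real schema.

def activation_rate(users):
    return sum(u["activated"] for u in users) / len(users)

def activation_gap(users):
    """Activation rate of tutorial completers minus that of skippers.

    A large positive gap is consistent with the hypothesis;
    a gap near zero refutes it.
    """
    skipped = [u for u in users if u["skipped_tutorial"]]
    completed = [u for u in users if not u["skipped_tutorial"]]
    return activation_rate(completed) - activation_rate(skipped)

users = [
    {"skipped_tutorial": True, "activated": 0},
    {"skipped_tutorial": True, "activated": 0},
    {"skipped_tutorial": True, "activated": 1},
    {"skipped_tutorial": False, "activated": 1},
    {"skipped_tutorial": False, "activated": 1},
    {"skipped_tutorial": False, "activated": 0},
]
print(round(activation_gap(users), 2))
```

Either outcome is useful: a confirmed hypothesis gives you the action, and a refuted one eliminates a wrong explanation before anyone acts on it.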