In Invisible Women, Caroline Criado Perez delivers something all too rare: a book about gender that’s really about data – how we collect it, who we collect it from, and what happens when entire populations are left out of the picture.
The premise is deceptively simple: when women aren’t counted, they don’t count. And across nearly every system we rely on – urban planning, healthcare, product development, economic policy – that’s exactly what’s happened. The result isn’t overt misogyny; it’s a quieter, more insidious kind of harm: one born from datasets designed with blinders on.
🔍 Key Themes: A Study in Flawed Data Design
- The Gender Data Gap: We like to think of data as objective, but Criado Perez shows how even our most trusted systems are riddled with blind spots. Women are underrepresented in the data that informs decisions, making their realities statistically invisible.
- The Default Male Fallacy: Many datasets and models treat the male body and male behavior as the standard. From crash test dummies to drug dosages, systems optimized for men put women at greater risk simply by treating them as outliers.
- Uncounted Labor, Unseen Impact: Economic models largely ignore unpaid caregiving and domestic work, despite it being the backbone of global productivity. The failure to quantify it distorts everything from GDP to public policy.
- Medical Research Bias: Clinical trials, diagnosis protocols, and symptom models often exclude or misrepresent women. The consequences aren’t just inconvenient; they’re deadly.
- Missing Voices = Missing Variables: When women aren’t present in policy rooms, dev teams, or research boards, the data guiding decisions doesn’t reflect their experiences. What gets left out isn’t random; it’s patterned.
📌 Statistical Blind Spots That Matter
- Crash Test Dummies: Designed around male bodies. Result? Women are 47% more likely to be seriously injured in a car crash.
- Unpaid Work: Women do 75% of the world’s unpaid labor, yet this work goes uncounted in economic measures like GDP.
- Clinical Research: Until the 1990s, women were routinely excluded from drug trials. Today, many medications are still dosed based on male physiology.
- Transit Planning: When Swedish towns began clearing snow from pedestrian routes (used more by women) before roads, hospital admissions for falls dropped – and so did public spending.
- Tech Design: Apple’s Health app launched with no menstrual tracking. Office thermostats are calibrated to the metabolic rate of a 154-pound man. Voice recognition systems perform worse for women.
💬 Why It Matters
Invisible Women is a must-read for anyone who works with data, designs systems, or makes decisions at scale. It’s a reminder that bias isn’t just an interpersonal problem; it’s a data architecture problem. If the inputs are flawed, the outputs will be too.
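To make that “flawed inputs, flawed outputs” point concrete, here is a small, purely illustrative sketch in Python. Nothing in it comes from the book: the groups, record counts, and accuracy figures are invented to show how a headline metric can look healthy while the underrepresented group quietly does worse.

```python
# Illustrative only: the groups, counts, and accuracy figures below are
# invented to show how an aggregate metric hides a representation gap.
from dataclasses import dataclass

@dataclass
class Subgroup:
    name: str
    n: int           # records in the dataset for this group
    accuracy: float  # how well a hypothetical model performs for this group

# A skewed dataset: far more male records than female ones.
groups = [
    Subgroup("male",   n=9_000, accuracy=0.92),
    Subgroup("female", n=1_000, accuracy=0.70),
]

total = sum(g.n for g in groups)

# The headline number looks healthy (89.80%)...
overall = sum(g.accuracy * g.n for g in groups) / total
print(f"overall accuracy: {overall:.2%}")

# ...but disaggregating by group shows who the system was really built for.
for g in groups:
    print(f"{g.name}: {g.n / total:.0%} of the data, accuracy {g.accuracy:.0%}")
```

The remedy isn’t exotic; it’s the habit Criado Perez argues for throughout the book: collect sex-disaggregated data and report it that way, so the gap is visible before decisions get made.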
Criado Perez doesn’t shout. She shows. With case studies, research, and hard stats, she reveals how the systems we’ve built are only as fair as the data behind them. And if we want equity, not just in theory but in practice, we need to start by asking: who are we measuring, and who are we leaving out?

