Understanding Your Biases
Episode 1

7:44 · Society
In this introductory episode, you'll dive into the world of cognitive biases, discovering what they are, why they exist, and how they unconsciously influence your decisions. By building a foundational understanding, you'll be better prepared to recognize and address these biases in your day-to-day life.

📝 Transcript

You make hundreds of decisions before lunch, yet many aren’t really “yours.” A quiet filter in your mind is steering what news you trust, who you hire, even how you remember arguments. Here’s the twist: that filter is trying to help you—and still regularly gets you into trouble.

That quiet filter doesn’t just influence big choices; it shapes the tiny ones you barely notice. Which tab you close first. Which stranger on the train you instinctively sit farther from. Which candidate’s resume “feels” stronger, even when the experience is the same. These split-second leanings aren’t random—they’re patterns your brain has learned from past shortcuts, culture, habits, and even the last headline you scrolled past.

Researchers have mapped more than 180 of these mental bends, and they reach into places we like to think are purely objective: courts, classrooms, hospitals, boardrooms. A single phrase in a policy, a photo in a news article, or the order of options on a form can quietly tilt outcomes for thousands of people. In this series, we're not trying to delete your biases (you can't), but to see them clearly enough to stop them from quietly steering the wheel.

Some signs of these patterns are easy to spot in others—political echo chambers, lopsided panels at conferences, viral myths that refuse to die. But up close, in your own thinking, they’re much harder to catch. You’re not walking around feeling “biased”; you’re walking around feeling *right*. Data helps reveal the gap. People, on average, spend over a third more time with articles that confirm what they already believe. Small wording tweaks in a tax letter have moved hundreds of millions of pounds. If a few sentences can sway a nation’s behavior, what are your daily inputs doing to you?

Think less about *whether* you have biases and more about *where* they’re hiding and *how* they show up in daily life. Three places are especially revealing: what you notice, what you remember, and what you ignore.

Start with what catches your eye. In a meeting, whose comments feel “insightful” versus “obvious”? When two people share the same idea at different times, does one land as brilliant and the other as redundant—depending on who said it, their title, or how they speak? That’s often where implicit patterns are quietly assigning extra credibility or discounting it. The same thing happens with headlines: similar claims can feel either “common sense” or “outrageous,” depending on the source logo attached.

Then look at what sticks in your memory. After a conflict, which details do you find yourself replaying later? People routinely recall their own contributions more clearly than others’ and remember being slightly more reasonable, generous, or consistent than they actually were. Over time, this selective recall shapes a story in which your own side of events seems not just understandable, but obviously justified.

Equally important is what slips past your attention altogether. Which metrics show up on your dashboards—and which never get measured? A hospital that tracks speed of discharge but not readmission rates quietly tilts staff toward rushing patients out. A company that measures “culture fit” but not “inclusion” tends to reward sameness, then later wonders why teams look and think alike.

Here’s the uncomfortable part: these patterns are often strongest exactly where you feel most objective—your sense of fairness, your political judgments, your belief that you can “read people.” One large-scale project using the Implicit Association Test didn’t just show that many people carry automatic associations about race, gender, or age; it also showed a weak link between consciously rejecting stereotypes and actually *escaping* their pull in quick decisions.

To work with this, treat your own mind as an environment you’re designing, not a neutral instrument you’re simply using. Just as architects add handrails where people are likely to slip, you can add small structures around moments where your thinking is most likely to tilt: checklists for reviewing candidates, blind review of creative work, or rotating who speaks first in a discussion. These aren’t about mistrusting yourself; they’re about acknowledging that your first impression is one data point, not the final verdict.

A simple way to spot these hidden patterns is to watch where your “gut feeling” clashes with the numbers in front of you. Picture scrolling through two reviews for the same restaurant: one written in polished language, one in broken grammar. Many people unconsciously weigh the fluent review as more credible, even when both report the same facts. Or consider performance feedback: a manager might describe one employee as “a natural leader” and another as “detail-oriented,” then later, those labels quietly shape who gets stretch projects, regardless of actual results.

In group settings, notice whose ideas get written on the whiteboard with names attached and whose don’t. Over time, that simple act can tilt perceptions of who is “driving” the work. Even calendar defaults matter: if meetings always land at times that clash with caregiving or commuting patterns, the people who can’t attend consistently become less central to decisions, without anyone explicitly choosing to exclude them.

Bias literacy may soon be as basic as reading and math. As workplaces lean on AI and metrics, the real skill will be noticing where numbers *hide* value: whose ideas get credit, whose risks are punished, whose errors are quietly forgiven. Like adjusting a camera lens, small tweaks—rotating meeting chairs, reordering speaking turns, varying who sets agendas—can surface different perspectives. Over time, organizations that treat bias as a design problem, not a character flaw, will likely make fairer, more durable decisions.

Treat this as ongoing fieldwork on your own mind. Each surprise—an assumption exposed, a shortcut revealed—is like spotting a new trail on a familiar hill. Over time, you’re not aiming to become perfectly “neutral,” but to widen the paths you can choose from, so more people’s realities fit inside the way you see the world.

To go deeper, here are three next steps:

1. Take the free Harvard Implicit Association Test (IAT) at implicit.harvard.edu for at least two categories you heard mentioned (like race and gender–career), then screenshot your results and jot down one concrete situation at work or home where those biases might show up.
2. Watch Dolly Chugh's TED Talk "How to be a good-ish person" and, using her ideas, pick one recurring decision you make (like hiring, grading, or feedback) and plug in a simple "bias check" step — e.g., always reviewing names last or using a standardized rubric.
3. Start a four-week "bias learning playlist" by adding the books *Blindspot* (Banaji & Greenwald) and *Thinking, Fast and Slow* (Kahneman) to your reading app or library holds today, and schedule one 30-minute block each week to read and then discuss one takeaway with a friend or colleague who also listened to the episode.
