Episode 54: Confirming Our Identities by Tricking Ourselves
August 4, 2025
How often do you think you are fooling yourself without being aware of it?
After hearing a striking study on the You Are Not So Smart podcast, I had to jump in and reflect. The episode explores how even the most rational, mathematically inclined people can twist numbers to support their identity-driven beliefs, often without realizing it. I unpack what that means for critical thinking, why simply informing people rarely works, and how it connects to behavior change, public health campaigns, and personal bias. There’s no tidy conclusion here. Just a set of tough, fascinating questions about how we form our beliefs and how hard we work to keep them intact.
Transcript
So I was just listening to this episode of a podcast called You Are Not So Smart.
First of all, if you're not familiar with this podcast: if you like the kind of stuff I do here, I'm sure you'd like the kind of stuff he does.
And his podcast is a much longer form.
And, I mean, he's a well-researched, well-regarded author and psychologist and all the rest of it.
Really interesting stuff.
But I practically ran home to record this because I was really jazzed up about this particular episode.
I will link to it in the show notes description.
It's episode 317.
Now, this sort of ties into a topic that I've jumped into somewhat on this show before.
But there is so much more to dive into, so there may be more episodes on it in the future.
But anyway, there was an episode I did of this show, which I believe I called something like Lies, Damn Lies and Statistics.
That show dove into one particular aspect of this, but the overarching point was about studies that claim to show certain things, and how math and statistics can be manipulated to back up one point versus another.
A well-known thing.
Again, there's this kind of common phrase in the cultural zeitgeist of lies, damn lies, and statistics.
Anyway, on this episode of this podcast, the host was going through a particular study that I found super interesting.
A lot of his show, particularly in the early years (it's been going on for quite some time), as well as the books he has authored, is about bias and how your brain tricks itself into believing certain things, even when they're not true.
So, anyway, he was going through a study, and again, I'm going to link to it.
So, if you're interested in the full version of this, go take a listen.
I'm going to go through it kind of quick.
The general idea was there was a study conducted, I think quite some time ago; I don't believe it was recent.
They essentially tested people to see if they were good at math.
Then they asked them some questions about where they fall politically.
Are they more liberal, more conservative, that kind of thing.
And then, for the people who showed that they really had strong math skills, they first presented some questions and some data around a fairly neutral topic.
The topic, I believe, was something along the lines of a face cream and whether or not it made your skin better or worse.
And there were some numbers attached.
And the general idea was that people who were strong in math could reason their way through the data to come to the correct conclusion, even though the data was presented in a somewhat misleading way.
If you just took the numbers in the study at face value, it sounded like they supported one conclusion.
But if you drilled in and calculated based on the data they provided, you found that the other conclusion was actually the correct one.
So, people – again, in summary, people who were good at math were able to reason their way through this fairly neutral topic of whether a face cream was helping or hurting with skin conditions or something like that.
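To make that trick concrete, here's a quick sketch in Python with hypothetical numbers in the spirit of the study (the counts and group labels are my own illustration, not the actual study data). At face value the raw counts seem to favor the cream, because far more cream users improved than got worse; but computing the improvement rate in each group flips the conclusion.

```python
# Hypothetical 2x2 table in the style of the face-cream question.
# At a glance, the raw counts seem to favor the cream: 223 users
# improved versus only 75 who got worse.
used_cream = {"improved": 223, "got_worse": 75}
no_cream = {"improved": 107, "got_worse": 21}

def improvement_rate(group):
    """Fraction of the group whose skin improved."""
    return group["improved"] / (group["improved"] + group["got_worse"])

print(f"used cream: {improvement_rate(used_cream):.1%} improved")
print(f"no cream:   {improvement_rate(no_cream):.1%} improved")
# The correct read is the rate, not the raw count: with these numbers,
# the no-cream group improved more often, so the cream actually
# looks harmful despite the bigger raw "improved" count.
```

The point is that getting the right answer requires doing the division rather than eyeballing the counts, which is exactly the step that strong-math participants performed here but abandoned on the politically charged version.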
So, they then took people who were strong in math.
I can't remember if it was the same people or a different group, but either way, people with strong math skills.
And they asked them a similar question, but one that was more politically charged.
So, for instance, might have been a question about gun control.
And the data had supported one side of the argument or the other, whether or not gun control was or wasn't effective, I believe.
But again, the data was set up to be somewhat misleading, where you would need to calculate through the numbers to figure out what the correct mathematical conclusion was based on the data.
What they found was that, where in the first part of the study, with a neutral topic, people with strong math skills came to the correct mathematical conclusion, as soon as the topic became politically charged and tied to their identity,
those same people with strong math skills would manipulate their own math and the numbers presented to support whatever preconceived conclusion they probably held based on their initial political alignment.
So, you know, the TLDR here is that when smart people are asked to look at things objectively, they're typically able to do so when the topic is not something that challenges their identity.
And then as soon as it does challenge their identity or they would, you know, sort of be biased in one direction or the other, those same smart people will use all of those smarts to enhance their own biases.
I had never heard of this study before.
It wasn't exactly an "oh my God, I had no idea" moment.
But the fact that they were able to study this in a very systematic way really demonstrates our own biases.
And it got me thinking about some other topics, ones that sort of intersect with this.
At a base level, one of the conclusions that I take away, at least from the summary of this study as presented,
is that we as people, we are very prone to change our opinions based on data, but only for things that we don't really care about.
And I'm trying to think through whether this is a sad conclusion to realize about ourselves as people.
And I think this also crosses into a misconception that was used for a long time and still gets used today:
the prevailing thought that if you just inform people, they'll draw the right conclusions, right?
For example, this was one of the thoughts that went into a bunch of, like, anti-smoking campaigns in, like, the 80s and 90s.
The general idea was, well, people just don't know how bad it is for you.
If we just tell them how bad it is, they'll stop.
And there have been many studies along the way showing, pretty conclusively at this point,
that that sort of thinking just isn't true.
Like, just presenting people with information doesn't do enough.
Particularly not for topics that someone cares about or ties to their identity.
And it's particularly not all that helpful for changing behavior.
You know, in the case of, like, smoking, ultimately what they're trying to do is change behavior.
Not just opinions, but behavior.
And information alone is not enough.
And this study, I think, lines up with that.
Like, there's a Venn diagram there where there's a lot of crossover.
It's not exactly what this study was showing, but I think there's an intersection point there.
So, I was thinking about that, and I was thinking about this other piece, which is, you know,
we're willing to change our mind for things that we don't care about all that much.
But it's different for something we care deeply about, or, probably more accurately psychologically speaking,
something we don't just care about but tie to our identity.
As soon as we tie something to our identity, we will actively manipulate everything around us.
The data, behaviors.
We will echo chamber and bias our way to supporting our foregone conclusions.
And it just makes me wonder, what is the right solution there?
And it brings up other questions, because you start to question these things as you go.
When someone is vehemently opposing whatever it is I'm thinking about, I do try to take a minute, as often as I can, and ask:
am I sticking to this because it's actually correct, or am I just finding reasons?
Am I cherry-picking information to support my conclusions?
It doesn't always work.
And sometimes I'm on one side of the fence or the other, but at the end of the day, part
of what this study is showing is that we might not even know.
We don't really have the tools in our minds to naturally evaluate whether or not we are manipulating ourselves.
Anyway, I found it very interesting.
I don't have a particular conclusion here.
It just brought up a lot of offshoot topics in my head.
So, I would strongly suggest that you go take a listen to this episode.
I loved it.
The study was super fascinating to me.
If you like that kind of stuff, definitely check out his podcast overall.
It goes into so many biases like this; just a really great podcast overall.
I've listened to it for years, a little on and off, not every single
episode, but as much as I can.
Really recommend it.
Check it out.
If it sounds interesting to you at all, definitely at least listen to that episode.
And I'd love to hear what you think.
I'll see you next time.