Why you're probably not as rational as you think you are — and what you can do about it

George Dvorsky
11/02/12 2:00pm

When it comes to self-improvement, few people consider their reasoning skills. Most of us simply take it for granted that, under most circumstances, we formulate perfectly rational opinions. But according to an emerging subculture of rationality gurus, there's still plenty of room for improvement. They believe there are ways we can train ourselves to make better decisions, as well as increase personal control over our lives, health, and happiness. Here are a few of their ideas about how you can become more rational.

To better understand rationality and how we can improve upon it, we spoke to Julia Galef, co-host of the Rationally Speaking Podcast, and the president and co-founder of the Center for Applied Rationality, a nonprofit think tank that teaches math- and cognitive science-based techniques for effective decision-making.

After speaking to Julia, it became clear that rationality is coming to be seen as a kind of cognitive enhancement - a likely explanation as to why so many lifehackers and futurists have started to take an interest. And as we also learned, becoming more rational is not as difficult as it may sound. When it comes to clearer thinking, often all we need to do is make a minor adjustment.

What is your own personal definition of rationality?

You may have noticed that when people say "That's not rational" they
usually just mean, "I disagree with you."

But to cognitive scientists, "rationality" means something specific. It's a set of techniques from math and decision theory for forming your beliefs about the world as accurately as possible, and for making decisions that are most likely to achieve your goals.

What would it be like if people were perfectly rational? Well, our confidence in a claim would match the amount of evidence backing it up. We'd change our minds in response to good arguments. We wouldn't stay stuck in jobs or relationships we hate, or make the same mistakes again and again.

We'd reflect on what we most care about - like our happiness, our friends' and families' happiness, or improving the world - and on the best way to work towards those goals. And then we'd actually do it, rather than remain in our old ruts.

Of course, none of us are ever going to be 100% rational all the time.
We're only human! We've got limited time, and focus, and energy.

But we can make some powerful improvements. Cognitive scientists like Nobel prize-winner Daniel Kahneman (author of Thinking, Fast and Slow) have amassed a treasure trove of research on ways that human brains could be more rational. So far, no one's been using that research to actually improve people's lives - that's why we founded the Center for Applied Rationality.

Given that virtually no one would admit to being irrational, how can a
person know they're not thinking rationally?

It's often easier to notice your own irrationality after the fact. I think we've all had that experience of wondering, "Why did I stay up all night when I knew I had to be on the ball this morning?" or "Why didn't I leave this awful job months ago?"

Catching irrationality in the moment is harder. Once I started paying attention, I was surprised at how frequently I noticed myself dismissing an argument because I didn't want it to be true, or because I felt defensive, or because it was being made by a pundit I dislike.

So one simple trick that I use all the time now is a thought experiment.
I'll ask myself, for example: "If it was my favorite blogger making this
argument, rather than the pundit I dislike, would I still think it's a bad
argument?" It's a handy way to check whether you're evaluating an
argument rationally or whether you're biased against it for some reason.

At CFAR we teach lots of tricks, many of them as simple as the thought experiment, to help people notice their own irrationality when it counts.

It's interesting that you bring up biases. Some of these are learned, but
most are an ingrained part of our psychologies. What are some common
biases that get in the way of rationality, and how can people best address
these glitches in their thinking?

Brief background: cognitive scientists have discovered that the human
brain has roughly two different ways of forming judgments: intuition and
reflection. (They're also called "System 1" and "System 2.") Our intuition
uses shortcuts and emotional cues, while reflection is what allows us to
plan ahead and to reason abstractly about things like math, logic, and
hypotheticals.

And it's a common misconception that being unbiased means only using
reflection. But in fact, your intuition is invaluable! Without its shortcuts,
we'd go crazy trying to reflect carefully on every single little decision.
And without its emotional cues, we'd be rudderless – we wouldn't know
what we cared about.

So it's more accurate to think of "biases" as cases of imperfectly coordinating your intuition and reflection. For example: Are you unhappy in your career, but reluctant to switch? It's worth checking whether you're being unduly influenced by the sunk cost fallacy, which makes us attached to things we've already sunk a lot of time or money into, even if we know we'd be better off quitting.

Are you afraid of your kids being kidnapped if you let them play outside?
That's understandable. But check and see if you're being influenced by
the availability bias, which makes us overemphasize risks that are vivid,
even if they are far rarer than "mundane" risks like car accidents.

Are you feeling moved to donate some money to save lives? That's great. But you might be able to do far more good, with the same amount of money, if you're aware that your intuition is quantity-insensitive. That is, our intuitive judgment feels roughly as happy about saving 1,000 lives as 100, so we don't automatically pay much attention to the quantity of good we're doing. In fact, studies have shown that people are less inclined to help a large group of children than a single child.

So rationality doesn't mean ignoring our feelings – it means understanding what's influencing those feelings, and having reflective tools to complement them. That gives us more control over the outcomes of our decisions for our own happiness, our families' happiness, and the good we can do for the world.

If I could get a bit epistemological for a moment, and setting aside our
psychological and emotional tendencies, how can we meaningfully talk
about being rational knowing that we're always making decisions with
insufficient information? Aren't we just fooling ourselves that we're
being rational, when in reality we're no more or less rational than
someone working off different knowledge?

Rationality is only defined with respect to the information you have. So if you and I have different information, we could both reason rationally and still end up at different conclusions. But given the information you have, there are more and less rational ways to use it.

And we'll always be making decisions with insufficient information, but there are different degrees of "insufficient information," right? A randomized experiment with a sample size of 100,000 is better evidence than a few anecdotes, which is in turn better evidence than just making something up. They're all imperfect, but to different degrees.
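
To put rough numbers on that spectrum, here's a minimal Python sketch, ours rather than anything from the interview, of how statistical uncertainty shrinks with sample size. It uses the textbook normal approximation for a proportion estimated from a simple random sample; the function name and the worst-case 50% baseline are illustrative assumptions:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of size n (normal approximation).
    p=0.5 is the worst case, i.e. the widest interval."""
    return z * math.sqrt(p * (1 - p) / n)

# A few anecdotes versus a large randomized experiment:
for n in (10, 1_000, 100_000):
    print(f"n = {n:>7,}: estimate is roughly +/- {margin_of_error(n):.1%}")
```

Ten observations leave the estimate uncertain to roughly +/- 31%, while the 100,000-person experiment pins it down to about +/- 0.3%: all imperfect, but to very different degrees, just as Galef says.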

Isaac Asimov had a typically punchy way of putting it: "When people
thought the earth was flat, they were wrong. When people thought the
earth was spherical, they were wrong. But if you think that thinking the
earth is spherical is just as wrong as thinking the earth is flat, then your
view is wronger than both of them put together."

The futurist community, including mindhackers, transhumanists, and Singularity types, has been an early adopter of techniques for improving rational thinking. Why do you suppose this is?

Partly it's just because futurists tend to be early adopters, period.

But I do think it goes deeper than that. Futurists are especially interested
in what's possible, and they're excited by the powerful new capacities
technology affords us – capacities to relieve suffering, make people
happier, and make new discoveries.

That's what rationality is. It's a mental technology, a way of affording ourselves more control over the outcomes of our decisions, so that we can better pursue our happiness. Personally, I find that just as inspiring as the physical technologies that afford us more control over our environment.

Why do futurists have such a fascination with Bayesian rationality in particular?

When people refer to "Bayesian rationality," that's just a more precise way of describing rationality as it's officially defined in cognitive science. "Bayesian" refers to Bayes' Rule, which is a theorem in probability theory that answers the question, "When you encounter new information, how much should it change your confidence in a theory?"
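
In symbols, the rule reads P(H|E) = P(E|H) * P(H) / P(E): how confident to be in a hypothesis H after seeing evidence E. As a concrete illustration, here's a minimal Python sketch of a single update; the scenario and every number in it are our own, purely for illustration:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """One application of Bayes' Rule:
    P(H|E) = P(E|H) * P(H) / P(E),
    where P(E) = P(E|H) * P(H) + P(E|~H) * (1 - P(H))."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Illustrative numbers: you start out 20% confident in a theory, then
# see evidence that's three times likelier if the theory is true
# (75% vs. 25%). How much should your confidence change?
posterior = bayes_update(prior=0.20,
                         p_evidence_if_true=0.75,
                         p_evidence_if_false=0.25)
print(f"confidence after the evidence: {posterior:.0%}")  # -> 43%
```

Evidence that's three times likelier under the theory moves you from 20% to about 43% confidence: a real shift, but nowhere near certainty, which is exactly the kind of calibrated change the rule prescribes.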

And why is it more popular among futurists? The connection there, if I had to guess, is that Bayes' Rule is used a lot in machine intelligence. So a lot of futurists who are familiar with computer science have developed a respect for it.

Earlier you mentioned the Center for Applied Rationality. What's your
mission statement, and how can your organization help people become
more rational?

CFAR is an evidence-based nonprofit, founded to give people more understanding and control of their own decision-making. We take research from probability theory and cognitive science and turn it into learnable skills, such as making accurate predictions, reshaping our motivations, and avoiding self-deception.

It's easier to learn new habits in a group, rather than alone reading
articles online, so we've been holding immersive 3-day rationality
training workshops. Our next is November 16-18, in the Bay Area, with a
special focus on how rationality is useful for entrepreneurs.

It's CFAR's view that this is the most valuable thing we can do for our
future: training ourselves to think rationally about high-stakes decisions.

That's because the challenges facing us in the modern world, such as weighing the risks and benefits of new technologies, or figuring out how to stave off epidemics or climate change, are far more complex than anything the human brain evolved to solve. And they're decisions we can't afford to approach with wishful thinking, blind ideology, or any other kind of irrationality.

Top Image: Lyao/Shutterstock. Inset image via Julia Galef.