
The (Honest) Truth About Dishonesty Summary

Quick Summary: The (Honest) Truth About Dishonesty explores why we cheat and why the rational cost/benefit model of crime gets it wrong. Spoiler alert: almost everybody cheats, but only a little bit – just enough to keep viewing ourselves as wonderful human beings.


Who Should Read “The (Honest) Truth About Dishonesty”? And Why?

If you consider yourself a good and honest person, then let Dan Ariely and his exceptional book, The (Honest) Truth About Dishonesty, burst your bubble: you are not, never have been, and never will be one.

If you want to learn why and you are a psychology student, then this book is a good place to start – but be sure to check out Dan Ariely’s other books as well (we’ve summarized all of them just for you: The Upside of Irrationality, Irrationally Yours, Payoff, Predictably Irrational).

But please read this one especially if you are a student of economics, because, well, you are taught wrong about practically everything.


What the Enron Scandal Says About Your Deceitfulness

Well, you all know the basic story by now, but it’s worth repeating yet again because we’re about to examine it in a wholly different manner.

Enron Corporation – voted “America’s Most Innovative Company” for six consecutive years – was an American energy company based in Houston, Texas, which in 2000 claimed revenues of over $100 billion and employed a staff of about 30,000.

And then, by the end of the following year, Enron had to file for bankruptcy.

What happened in the meantime?

Well, in October 2001, it came to light that through a series of pretty creative accounting tricks (accounting loopholes and special purpose entities, mostly), coupled with poor financial reporting (and, to some extent, indifference on the part of its auditor, the now-defunct firm Arthur Andersen), Enron had managed to hide billions of dollars in debt from failed projects and deals from its investors.

The main culprits?

The smartest guys in the room, the “three sinister C-level architects”: Kenneth Lay (the Chairman), Jeffrey Skilling (the CEO), and Andrew Fastow (the CFO).

The problem?

Well, let’s just say that this may not have been the whole story.

And that’s exactly the thought that crossed the mind of Dan Ariely – the author of The Honest Truth About Dishonesty – after a discussion with a close friend of his, a one-time consultant for Enron.

He, in the words of Ariely, “hadn’t seen anything sinister going on… Even more surprising, he also told me that once the information was out, he could not believe that he failed to see the signs all along.”

“I started wondering if the problem of dishonesty goes deeper than just a few bad apples,” writes Ariely. “I also wondered whether my friends and I would have behaved similarly if we had been the ones consulting for Enron.”

You already sense the answer, don’t you?

Calling All Art Enthusiasts

OK, but that’s Enron!

The larger and more complex a company is, the more difficult it is to see what’s going on – and, let’s face it, the more tempting it is to do something dishonest.

After all, if Enron has revenues of $100 billion, would anyone notice $50 missing?

OK, then, another example.

And this one involves the gift shops of the John F. Kennedy Center for the Performing Arts in Washington, D.C.

All you need to know for now is that these gift shops were run much like lemonade stands: there were no cash registers, merely cash boxes. And they did quite well, earning the Center about $400,000 a year.

However, each year, more than a fourth of that disappeared.

The manager tried to find the culprit and, after suspecting a young employee for a while, organized a sting operation with the help of a detective; soon after, the young man was caught with marked bills in his pocket.

Problem solved, right?

Well, far from it: money and merchandise still went missing, even after the man was caught; in a way, catching him barely made any observable difference as to the amount of stolen cash.

The manager saw no way out but to set up an inventory system with price lists and sales records.

Surprise, surprise: the thievery stopped.

Apparently, there wasn’t just one bad apple: everybody was stealing.

Oh, we forgot to tell you who everybody is!

The sales force at the gift shops was made up of “elderly, well-meaning, art-loving volunteers,” “mostly retirees who loved theater and music.”

The moral?

In the words of the manager: “We are going to take things from each other if we have a chance… many people need controls around them for them to do the right thing.”

Big Brother Is Watching

In a way, both Enron and the gift shops boil down to a simple problem: whether we’re capable of resisting stealing something when nobody is watching.

And this is a problem as old as time, going back all the way to Plato’s Republic.

In Book II, Plato’s brother Glaucon asks Socrates whether a man can be virtuous if he knows for a fact that his crimes would be undetectable.

To illustrate his point, he tells the story of the Ring of Gyges, found by one of King Gyges’ ancestors, a mere shepherd.

When twisted on the finger, this ring was so powerful that it granted invisibility to the one wearing it.

Of course, the shepherd used it to infiltrate the king’s court, seduce the queen, and eventually kill the king and crown himself as the next one.

(Tolkien much, by the way?)

Now, Socrates doesn’t really answer Glaucon’s question directly; what he says, in many, many more words, is that a man can be virtuous only if the state is – that is, only if he knows that the state is watching him.

And, you know what?

(Plug your ears, civil libertarians)

He was right!

“To me,” writes Ariely, “Plato’s myth offers a nice illustration of the notion that group settings can inhibit our propensity to cheat. When we work within a team, other team members can act informally as monitors, and, knowing that we are being watched, we may be less inclined to act dishonorably.”

Don’t believe us?

There are experiments, boy!

Apparently, merely putting an image of a pair of eyes above an unattended cash box asking for donations makes people donate much more than in its absence (or if, say, the image is of flowers).

Why?

Well, let’s see why.

The SMORC and Everything That’s Wrong with It

If you know anything about Dan Ariely, then you already know that he’s a behavioral economist who thinks (like many other current economists) that rational economics is based on a serious study of human nature about as much as flat-earth theory is based on a study of the Earth’s circumference.

And that’s not an exaggeration!

Just think of the SMORC, aka the Simple Model of Rational Crime.

Its origin is pretty mundane.

Gary Becker, a Nobel Prize-winning economist, was once running late for a meeting and couldn’t find a legal parking spot. So, he opted to park illegally.

Why?

Because he decided that the cost of missing the meeting was greater than the cost of being caught, fined, and possibly towed.

In Becker’s mind, that must be exactly what’s happening in the mind of most criminals.
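At bottom, the SMORC is a bare expected-value comparison: commit the “crime” whenever the benefit outweighs the penalty discounted by the probability of getting caught. Here is a minimal sketch of that logic – all the numbers are made up purely for illustration, and nothing below comes from Becker or Ariely:

```python
# Simple Model of Rational Crime (SMORC) as a bare expected-value
# comparison. All figures are hypothetical, chosen only to illustrate
# Becker's parking anecdote.

def expected_cost_of_crime(fine: float, p_caught: float) -> float:
    """Expected penalty: the fine discounted by the chance of being caught."""
    return fine * p_caught

def smorc_decision(benefit: float, fine: float, p_caught: float) -> bool:
    """Under SMORC, commit the act iff benefit exceeds the expected penalty."""
    return benefit > expected_cost_of_crime(fine, p_caught)

# Becker's dilemma, with invented numbers:
value_of_making_meeting = 200   # dollars (hypothetical)
parking_fine = 50               # fine plus possible towing (hypothetical)
chance_of_ticket = 0.3          # probability of being caught (hypothetical)

# 200 > 50 * 0.3, so the model says: park illegally.
print(smorc_decision(value_of_making_meeting, parking_fine, chance_of_ticket))
```

As Ariely goes on to show, the trouble with this tidy model is that real people simply don’t behave this way.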

The problem is – we now know for a fact that it doesn’t work that way.

Take this simple experiment.

Two groups are tasked with solving 20 math problems and told that they will be paid $5 for each correct solution.

There’s a catch, of course: the first group’s worksheets are reviewed, while the second group’s claims are taken for granted.

As expected, the unreviewed group reported noticeably better results: on average, they claimed to have solved 6 problems, while the reviewed group’s actual average was 4.

Now, the stakes were upped: what if the participants were offered $10 or $15 per solution?

According to SMORC, they should have claimed to have performed better.

But that didn’t happen.

In fact, nothing changed.

The most interesting part?

Nothing changes even if the unreviewed group is given the opportunity to shred their tests, thus eliminating the chance of being caught.

Two Types of Motivation

The reason why the SMORC is wrong is pretty simple: the motivation to lie is merely one aspect of our character.

“Honesty and dishonesty,” clarifies Ariely, “are based on a mixture of two very different types of motivation. On the one hand, we want to benefit from cheating (this is the rational economic motivation), while on the other, we want to be able to view ourselves as wonderful human beings (this is the psychological motivation).”

In other words, we want to both have our cake and eat it too.

Rationally, this is impossible.

However, as Dan Ariely has demonstrated to us over and over again, we are not rational beings.

And our compromise is surprisingly simple.

“Basically,” concludes Ariely, “as long as we cheat just a little bit, we can have the cake and eat (some of) it too. We can reap some of the benefits of dishonesty while maintaining a positive image of ourselves.”

This is what Ariely terms the “fudge factor theory.”

In essence, it’s about balancing things right.

The psychological motivation – our innate wish to see ourselves as wonderful human beings – is just as powerful as the rational economic drive: to earn more with less.

And this is exactly why certain forces such as the two described above (“the amount of money we stand to gain and the probability of being caught”) influence our decisions much less than one might think they do.

At the same time, other forces influence us more than we might expect.

There’s a whole list of the latter: “moral reminders, distance from money, conflicts of interest, depletion, counterfeits, reminders of our fabricated achievements, creativity, witnessing others’ dishonest acts, caring about others on our team, and so on.”

Let’s see how the first two of them work in practice.

May the Ten Commandments Be Engraved in You

Let’s add another twist to the experiment with the two groups assigned with solving a math test.

The first one – if you read carefully – was informed that their tests would be reviewed; the second one, that there was no mathematician in the house, so the guys shelling out the money ($5 per correct solution) would have to take their word for it.

The twist is this: the second group was now split in two, one half asked to reread the Ten Commandments before the test, the other instructed to recall ten random books they had studied in high school.

The latter group cheated as much as before; the group that recalled the Ten Commandments before taking the test, however, didn’t cheat at all.

Apparently, it doesn’t even matter if you are religious or not. What matters is the feeling that there’s something more to cheating than your own benefit.

Here’s another example for the atheists who can’t relate to the one above.

A business consultant (actually, a comedian in disguise) comes to your school and gives you a lesson on how to cheat and get away with it.

Do you:

a) consider the advice helpful;
b) have an uneasy feeling that a business consultant is telling you this; or
c) both of the above.

For most people, it’s c, aka a clear example of seeing both motivations at work.

“This advice is useful,” says the rational voice in your head. “Wait a second,” a second, psychologically motivated voice joins in, “isn’t it bad to do this?”

One Step Removed from the Money

Another experiment shows precisely why the more distant you are from the money, the less tempted you are to do something bad.

The experiment goes something like this.

In an MIT dorm, half of the communal refrigerators are stocked with six-packs of Coca-Cola, the other half with paper plates containing six $1 bills.

Of course, all of the students know that neither the Cokes nor the dollar bills belong to them; they also know (or have a vague idea) that the two are worth pretty much the same (you can buy a Coke for $1).

What do you think happened after three days?

Yup: all of the Cokes were gone, but not one of the dollar bills was touched. Nothing stopped the students from taking a dollar bill, walking over to the nearby vending machine, and getting a Coke and some change; yet, for some reason, no one did.

And that reason is psychological: “the fudge factor” increases the greater the psychological distance between the dishonest act and its consequences.

In plainer words, it is easier to keep seeing yourself as a good person if the dishonest act takes just one step to rationalize (taking a Coke); if it takes more than one (taking the money and then buying a Coke), it becomes much harder to digest psychologically.

It’s basically the difference between premeditated murder and a crime of passion: the law punishes them differently because, in the first case, you had time to change your mind.

“From all the research I have done over the years,” concludes Ariely, “the idea that worries me the most is that the more cashless our society becomes, the more our moral compass slips.”

Is There a Solution?

Ariely’s book abounds with examples and experiments such as the few described above, each more interesting than the last.

However, for the purposes of this summary, we decided to limit ourselves to the few that most directly show what one should do to limit dishonesty.

In a way, it’s nothing we don’t already know intuitively – but it is everything most rational economists and politicians forget when setting up our systems (hence, the crises).

Here’s a secular (and elegant) example that speaks volumes about keeping people honest.

A woman in South America noticed that her maid had been stealing a little bit of meat from the freezer every few days.

She was rich enough not to care about this too much, but nobody likes being cheated (especially if it sometimes leaves you without enough meat to make dinner), so she decided to do something about it.

And no – she didn’t hire a private detective or fire the maid.

She just put a lock on the freezer and told the maid that she suspected someone was stealing from it; then, she gave her a key to the lock and a small raise, adding that the two of them were now the only ones with a key and would have to be watchful.

Of course this worked!

Because by putting a lock on the freezer the woman both put a check on the maid’s temptations and created a psychological distance.

Now, stealing the meat would mean making two steps (unlocking the freezer and then taking some meat), and that’s not easy to rationalize.

Finally, by singling out the maid as the most trusted person, the woman put a moral burden on her – the same one the Ten Commandments put on the math test-takers.

And that made all the difference.

Key Lessons from “The (Honest) Truth About Dishonesty”

1. Honesty and Dishonesty are Based on a Mixture of Two Contrasting Motivations
2. You Constantly Rationalize Your Dishonesty to Make Yourself Look Good (to You)
3. The More Distant the Dishonest Act, the More Difficult to Cheat

Honesty and Dishonesty are Based on a Mixture of Two Contrasting Motivations

According to most economists, people cheat only after making a rational cost/benefit analysis. In other words, if there’s a low chance of being caught and a high benefit from a dishonest act, then it’s only human to cheat.

However, behavioral economists have discovered that it doesn’t work that way. Apparently, we do perform this cost/benefit analysis, but only in combination with a psychological assessment of our actions. In plainer words, we cheat only when, and to the extent that, we can rationalize it.

You Constantly Rationalize Your Dishonesty to Make Yourself Look Good (to You)

In combination, these two motivations work like this.

If you are given a test with twenty questions and promised $5 for every correct answer, but told that there’s nobody to review your answers – then you’ll probably cheat.

But you won’t cheat more even if you’re offered $50 per correct answer and given the opportunity to shred the test afterward.

Why?

Because cheating too much puts a strain on your brain; you want your brain to process your cheating in a certain “I did it, but not as much as I could have” manner.

Because this both allows you to cheat and lets you keep seeing yourself as a good person.

The More Distant the Dishonest Act, the More Difficult to Cheat

The more obstacles there are between you and the dishonest act, the bigger the strain on your brain.

That’s why you’ll think twice before stealing someone’s money, but only once before stealing someone’s Coca-Cola.

Politicians and economists – please, use this knowledge!


The (Honest) Truth About Dishonesty Quotes

“The more cashless our society becomes, the more our moral compass slips.”

“We found that knowing that others will benefit from our actions does indeed motivate people to cheat more.”

“The question, then, is whether the only force that keeps us from carrying out misdeeds is the fear of being seen by others.”

“The more creative we are, the more we are able to come up with good stories that help us justify our selfish interests.”

“Recognizing our shortcomings is a crucial first step on the path to making better decisions, creating better societies, and fixing our institutions.”

Final Notes

The (Honest) Truth About Dishonesty is everything you’d expect from a book written by Ariely (or any other popular behavioral economist nowadays): thought-provoking, counterintuitive, and filled with experiments.

“Required reading for politicians and Wall Street executives,” says a Booklist review.

Required reading for everyone, we add.
