Let’s play a game called “Would you Rather?”

Would you rather have…

- A) a 90% chance at winning $1000; or
- B) $900 guaranteed?

Most people choose option B with little hesitation. It has the same *expected value* (= probability × prize) as option A, but more certainty. There’s no FOMO (fear of missing out); there’s no risk of walking away empty-handed and feeling like an idiot.

How about…

- C) a 20% chance at winning $4000; or
- D) a 25% chance at winning $3000?

This one is a tougher choice, so take your time and think about it.

Did you choose option C? I did, as did 65% of participants in a 1979 study by Daniel Kahneman and Amos Tversky. Did you go with your gut, or did you calculate the *expected value* of these choices? Option C has a better expected value than option D ($800 vs. $750). So if you chose C, then congratulations: you went with the rational choice!
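If you'd rather let a computer do the arithmetic, the expected-value calculation is a one-liner (a minimal sketch):

```python
def expected_value(probability, prize):
    """Expected value of a simple gamble: probability times prize."""
    return probability * prize

# Option C: 20% chance at $4000
print(expected_value(0.20, 4000))  # 800.0
# Option D: 25% chance at $3000
print(expected_value(0.25, 3000))  # 750.0
```

On a purely statistical basis, C beats D by $50.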

Next, would you rather have…

- E) an 80% chance at winning $4000; or
- F) $3000 guaranteed?

Again, take your time. Pretend you’re on a gameshow. E or F?

F. Obviously. There’s no FOMO and $3000 is nothing to sneeze at. Sure, you could get more with option E, but you could also end up with nothing. 80% of participants in the 1979 study agree.

But is that rational?

No.

The *expected value* of option E is higher: $3200 vs. $3000 for option F. If you were trying to maximize your winnings, you would choose option E. On average, you would be $200 better off on this particular gameshow if you ignored your FOMO and took the gamble.

It feels wrong to go with the cold, rational choice though, doesn’t it?

## Decision weighting

Humans are not rational. They tend to overvalue low probabilities and undervalue high probabilities. Kahneman and Tversky measured these tendencies by playing “Would you Rather?” over and over, with a variety of problems and a variety of people. Here’s what they came up with.

Every person has their own decision weighting curve. These curves vary with the size of the prize, and they differ for losses versus gains. The shape is usually an inverse S: the decision weights start out higher than the probabilities, then dip below them. People overvalue low probabilities and undervalue high probabilities.
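For the curious, one commonly used functional form for this curve comes from Kahneman and Tversky's later work on cumulative prospect theory. Here's a sketch; the exponent is their published median estimate for gains, but treat the exact numbers as illustrative rather than a property of any particular person:

```python
def weight(p, gamma=0.61):
    """Inverse-S probability weighting function from cumulative prospect
    theory: w(p) = p^g / (p^g + (1-p)^g)^(1/g). gamma=0.61 is the median
    estimate for gains reported by Tversky & Kahneman (1992)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Low probabilities get overweighted...
print(weight(0.05))  # ~0.13, far above 0.05
# ...and high probabilities get underweighted
print(weight(0.95))  # ~0.79, well below 0.95
```

The crossover where the curve dips from overweighting to underweighting sits somewhere in the middle, which is exactly the S-shape described above.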

### People overvalue low probabilities

Low probability overestimation is evident in gambling. The expected value of any roulette bet is lower than the amount wagered due to those pesky 0 and 00 slices on the wheel. But people play roulette anyway. Yes, it’s possible to win big, but it’s not likely. In the long run, the house always wins.

To explain this, let’s do another “Would you rather”. Would you rather have…

- G) $10 guaranteed; or
- H) a 2.6% chance of walking away with $360?

This is the choice you make when you put $10 on lucky number 13 in roulette. A 35 to 1 payout means a win returns your $10 stake plus $350 in winnings, and a single number hits with probability 1/38 (about 2.6%) on an American wheel. Option G has an expected value of $10. The expected value of placing the bet, option H, is 1/38 × $360 ≈ $9.47. On average, you lose about $0.53 every time you bet on lucky number 13. In other words, if you played this same bet 100 times, you’d expect to end up losing about $53. It’s not a rational thing to do.

Yet people do this over and over and over. Happily! Setting aside all the psychological tricks that casinos play, decision weighting explains why: gamblers assign these long shots a decision weight higher than their true probability. What casinos are doing here is arbitrage. The gamblers get the thrill of imagining a rare victory, and the casino pockets its edge on every bet, over and over.
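You can recover the casino's edge with a quick simulation. This is a sketch assuming a standard American wheel: 38 slots, a 35-to-1 payout, and the stake returned on a win:

```python
import random

def average_result_per_spin(bet=10, spins=1_000_000, seed=42):
    """Simulate repeatedly betting `bet` dollars on a single number of an
    American roulette wheel (38 slots, 35-to-1 payout, stake returned on
    a win). Returns the average net result per spin (negative = loss)."""
    rng = random.Random(seed)
    net = 0
    for _ in range(spins):
        if rng.randrange(38) == 0:  # our lucky number hits
            net += 35 * bet         # winnings; the stake comes back too
        else:
            net -= bet              # stake lost
    return net / spins

# Theoretical edge: (1/38)*350 - (37/38)*10 = -20/38, about -$0.53 per spin
print(average_result_per_spin())
```

Over a million simulated spins the average lands close to the theoretical house edge, negative every time you run it with enough spins.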

Lottery tickets are a similar deal, but with far longer odds.

### People undervalue high probabilities

As we saw in the “Would you Rather” questions above, most humans prefer a sure thing over a gamble, even when the gamble has a higher expected value and is itself almost a sure thing.

Another example of this is lawsuit settlements. Let’s say you are the plaintiff in a lawsuit and you have a 90% chance of winning $1 million. The judge may be having a bad day, so there’s a 10% chance that your suit will be thrown out. Would you rather go to court, with an expected value of $900,000, or take a guaranteed $800,000? Most people would take the sure thing.

There’s an industry that takes advantage of this by “buying” lawsuits. The buyer guarantees the plaintiff a payout that is lower than the statistically expected value but higher than the decision-weighted value. The plaintiff gets a guaranteed payout with no FOMO, and the buyer of the lawsuit profits on average. It’s a win-win (ignoring the unnecessary strain on the legal system, of course).
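To put rough numbers on that arbitrage band, here's a sketch using an illustrative probability-weighting function from cumulative prospect theory (the exponent is Tversky and Kahneman's 1992 median estimate for gains; real plaintiffs will vary):

```python
def weight(p, gamma=0.61):
    """Illustrative inverse-S probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

prize, p = 1_000_000, 0.90
expected = p * prize      # $900,000: the statistical value of the suit
felt = weight(p) * prize  # roughly $712,000: how the gamble "feels"

# A guaranteed offer anywhere between `felt` and `expected` can look
# attractive to the plaintiff while still leaving the buyer a profit
# on average -- the $800,000 settlement above sits in that band.
print(round(felt), round(expected))
```

The gap between the decision-weighted value and the expected value is the buyer's margin.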

## Further reading

Decision weighting is one of many counter-intuitive concepts covered by the book *Thinking, Fast and Slow*. It’s written by Nobel laureate Daniel Kahneman and is a beautiful intersection of psychology and economics.

Another great book on these concepts is *The Undoing Project*, which I wrote about in “5 psychological principles from The Undoing Project”.

Are you interested in the impact of decision weighting on user experience? Then check out this article by the always-great Nielsen Norman Group: “Prospect Theory and Loss Aversion: How Users Make Decisions”.

Do you have a “Would you rather” for me or further reading that I haven’t mentioned? Leave a comment below!
