Vsauce! Kevin here, and I

have a simple coin-flipping game that requires no skill, has no catch or trick, and can lead to infinite wealth.

The thing is… nobody really wants to play it. Why? How is it possible that an incredibly easy

game with infinite upside causes virtually everyone to react with a massive yawn? To play this game, we’ll turn to the most

rational, calculating man in history: long-time friend of Vsauce2, Dwight Schrute. You walk

up to the table to flip a coin. Your prize starts at $2. If the coin flip results in

FALSE, the game is over and you win $2. If it lands on FACT, you play another round and

your prize doubles. Every time you get a FACT, you keep playing and the prize keeps doubling

— from $2 to $4 to $8, $16, $32, $64,

$128, so on and so on… forever. But as soon as you get a FALSE, you are done

and you collect your winnings. So if you hit a FALSE in the third round, then your prize

is $8. If your first FALSE comes in Round 14, you’d walk away with $16,384. No matter

how unlucky you are, you’ll never win less than $2. If things go really well… then

things could go really well. Now that you know the potential payoffs, how

much would you be willing to pay to play this game? $3? What about $20, $100? The winnings

could be infinite so the question is: how much is a chance at infinite wealth worth

to you? We can determine the precise answer, but first

we need to know the game’s expected value, which is the sum of all its possible outcomes,

each weighted by its probability. That determines the point at which we choose to play a game

— or, in the real world, the point at which we decide to take out insurance on our house

or a life insurance policy.

If our risk is less than our likely reward, we should play.

If we’re paying too much relative to what we’re likely to get out of playing, then

we should not play. Here’s the expected value of the Schrute game. You’ve got a 50/50 chance of losing on your

first flip and heading back to the beet farm with $2. With a probability of ½ and a payoff

of $2, your expected value from the first round is $1. The probability of the game ending

in the second round is ½ * ½, or ¼, and your prize there would be $4. That’s another $1 in expected value.

For three successful flips, it’s (½)^3 — or ⅛ — times $8. Another dollar. 1/16 * 16…

1/32 * 32… 1/64 * 64… For n rounds, the expected

value is the probability (½)^n * the payoff of 2^n — so no matter the value of n, the result will be 1. The

expected value of the game is 1 + 1 + 1 + 1 + 1… forever. Because each round adds

$1 of value no matter how rare the occurrence might be. The expected value is infinite. And there’s our paradox. Because you’d

think a rational person would pay all the money they have to play this game.
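The round-by-round arithmetic above is easy to verify. Here’s a short sketch of my own (not from the video) showing that every round contributes exactly $1 of expected value, so the partial sums grow without bound:

```python
# Round n ends the game with probability (1/2)**n and pays 2**n dollars.
# Its contribution to the expected value is the product of the two.
def round_contribution(n: int) -> float:
    return (0.5 ** n) * (2 ** n)

# Every term works out to exactly $1, so the expected value summed over
# the first N rounds is $N — and the full infinite sum diverges.
contributions = [round_contribution(n) for n in range(1, 21)]
partial_value = sum(contributions)  # $20 after 20 rounds, and still climbing
```

Because each probability halves exactly as each prize doubles, the terms cancel perfectly; that cancellation is the whole engine of the paradox.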

Mathematically,

it makes sense to pay any amount of money less than infinity to play. No matter what

amount of money you risk, you’re theoretically getting the deal of a lifetime: the reward

justifies the risk. But nobody wants to do that. Who would empty their bank account to

play a game where they know there’s a 75% chance they walk away with $4 or less? It’s confusing because expected value is,

mathematically, how you determine whether you’ll play a game. Look, if I offered you

a coin-flipping game where you won $5 on heads and lost $1 on tails, your expected value

of each round would be the sum of those possible outcomes: (50% chance * +$5) + (50% chance

* -$1).
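As a sanity check (my own illustration, following the formula just stated), the expected value is simply the probability-weighted sum of the two outcomes:

```python
# Two equally likely outcomes: win $5 on heads, lose $1 on tails.
outcomes = [(0.5, 5.0), (0.5, -1.0)]

# Expected value = sum of (probability * payoff) over all outcomes.
expected_value = sum(p * payoff for p, payoff in outcomes)
# 0.5 * 5 + 0.5 * (-1) = 2.5 - 0.5 = $2 per round on average
```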

Half the time you’ll win $5, half the time you’ll lose $1. In the long run,

you’ll average +$2 for every round you play. So paying anything under $2 to play that game

would be a great deal. When the price to play is less than your expected value, it’s a

no-brainer. And since the expected value of the Schrute

game is infinite, paying anything less than infinite money to play it should also be a

no-brainer. But it’s not. Why? The thing that’s so interesting about this

game is how the math conflicts with… actual humans. Enter Prospect Theory, an element

of cognitive psychology in which people make choices based on the value of wins and losses

instead of just theoretical outcomes. The reason people don’t want to empty their

pockets to play this game despite its infinite gains is that the expected marginal utility

— its actual value to them — goes down as those mathematical gains increase forever.

This solution was discovered a few years ago.

A few hundred years ago. In 1738, Daniel Bernoulli

published his "Exposition of a New Theory on the Measurement of Risk" in the Commentaries of the Imperial Academy

of Science of Saint Petersburg — and what we now call the St. Petersburg Paradox was

born. Bernoulli didn’t dispute the expected value of the St. Petersburg game; those are

cold, hard numbers. He just realized there was a lot more to it. Bernoulli introduced

the concept of the expected utility of a game — what was, until the 20th century, called

moral expectation to differentiate it from mathematical expectation.

The main point of Bernoulli’s resolution

was that utility, or how much a thing matters to you, is relative to an individual’s wealth

and that each unit tends to be worth a little less to you as you accumulate it. So, as an example, not only would winning

$1,000 mean a lot more to someone who’s broke than it would to, say, Tony Stark, but

even winning $1 million wouldn’t make a dent in the research and development budget at Stark Industries. And there’s also a limit on a player’s

comfort with risk, with John Maynard Keynes arguing that a high relative risk is enough

to keep a player from engaging in a game even with infinite expected value. Iron Man can

afford to lose a few billion. You probably can’t. And value itself is subjective. If I won 1,000

peanut butter and jelly sandwiches, I would be THRILLED.

If someone allergic to peanuts

won them, they’d be… less thrilled. So. Okay, okay. Given all this, how much can

YOU afford to lose in the St. Petersburg game? How badly do you want to play? Bernoulli used

the logarithmic function to come up with price points that factored in not only the expected

value of the game, but also the wealth of the player and the game’s expected utility to them. A millionaire

should be comfortable paying as much as $20.88 to flip Schrutes, while someone with only

$1,000 would top out at $10.95. Someone with a total of $2 of wealth should, according

to the logarithmic function, borrow $1.35 from a friend to pay $3.35.

Ultimately, everyone has their own price that

factors in their wealth, their desires, their comfort with risk, their preferences, how

they want to spend their time, what else they could be doing with their money, their own

happiness… And the thing is… this game can’t even

exist. Economist Paul Samuelson points out that a game of potentially infinite gain requires

the other party to be comfortable with potentially infinite loss. And no one is cool with that. So if the important elements are variable

and the game can’t exist, what’s the point? The St. Petersburg Paradox reminds us that

we’re all more than math. The raw numbers might convince a robot that it’s a good

idea to wager its robohouse on a series of coin flips, but you know deep down that’s

a really bad idea.

Because you aren't an expected value calculation.

You aren't a logarithmic function. The numbers are a part of you and help you live your life.

But in the end, you are… you. Fact. And as always, thanks for watching.