Not really a paradox, but a puzzle in probability theory.

Suppose I have 2 envelopes. I put $100 into the first, and either twice as much ($200) or half as much ($50) into the other, with equal probabilities (0.5 each). I give you the first envelope (with $100); you may either keep it or switch. What should you do?

Well, if you stick with your envelope, you always get $100. But if you switch, you get $50 with probability 0.5 and $200 with probability 0.5, so your expected payoff is (0.5)($50) + (0.5)($200) = $125. Clearly you should switch, for an expected gain (just for switching) of $25.
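That $25 figure is easy to check empirically. Here is a minimal simulation sketch (the function name `play` and the trial count are my own choices, not part of the original puzzle):

```python
import random

def play(switch, trials=100_000):
    """Average payoff over many plays of the $100 game."""
    total = 0
    for _ in range(trials):
        other = random.choice([50, 200])  # the other envelope: half or double
        total += other if switch else 100
    return total / trials

print(play(switch=False))  # always exactly 100.0
print(play(switch=True))   # hovers around 125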

That's not the paradox. This is...

Now I take my 2 envelopes; I put some random sum $N into the first, and $(2*N) into the second. I toss a coin (independently of N, of course) and give you the first envelope if it comes up heads, or the second if it comes up tails. I let you examine the envelope, and inside you find $M (M is either N or 2*N). I offer you the possibility of switching.

You definitely want to switch! By the above logic, the other envelope contains $(M/2) with probability 0.5 and $(2*M) with probability 0.5, so if you switch you expect to gain $M/4. But the same reasoning holds even if I don't let you see the sum of money in the envelope! (it's just a random number, after all, and your decision is always to switch, no matter what).

So given the above 2 envelopes, which are indistinguishable, you always want the other envelope.

If that isn't weird enough, consider what happens after you switch: since you didn't look inside the first envelope to decide to switch, you don't need to look inside the second to decide to switch again. And every time you switch, you expect to "gain" more money (in an exponential series!).

I don't get this...

I understand the mathematics, but I don't understand how this makes sense in a real world example. You can't expect to gain $25 for switching, because you have a 0% chance of gaining $25. You either gain $100, or lose $50. If we perform a long series of envelope switching exercises, and I switch every time, then I'll average a $25 gain over the duration. But that's not what we're doing. On a one-time-shot basis, you have just a 50/50 chance of coming away with more money. On a risk vs. reward basis, you're an idiot for not switching, though.

I'll tell you why I would switch. I've just been given $100. Regardless of what I do, I'm coming away with more money than I showed up with, because there's some moron giving away free money as long as you play some weird envelope game with him. Might as well go for the $200.

By the above logic, the other envelope contains $(M/2) with probability 0.5 and $(2*M) with probability 0.5, so if you switch you expect to gain $M/4.

One of the problems with this approach to game theory is that although "statistically" one can "expect" to gain $M/4, no such real alternative exists in the game. This is the same situation as a town in which half the population makes $10,000/year, and half makes $110,000/year. The mean income is $60,000/year, but NOBODY actually has that income! It's a situational failure of the descriptive statistics.

In fact, any situation where one can stand either to gain or lose by "choosing the other envelope" versus keeping the one you have can be *described* mathematically, but not "solved" that way in real life. It ultimately comes down to whether the individual is a risk-taker or prefers a sure thing.

A rather related thingy I thought up:

Let us take the two-envelopes situation and make it into a game, where you can keep switching over and over, with a new set of envelopes every time (which are based on your current envelope). You start with $128. At the beginning, the two unknown envelopes, therefore, contain $64 and $256. As already stated, we expect an average gain of 1/4 our current value every time. So, keep switching a large number of times, and you'll be in the money!

We could also model the game as follows: You have a "ladder" of money values, with each rung two times the one below it. Each time, you are asked, "Hit or stay?". Stay means that you are done playing the game, and you wish to take your money. Hit means that you have a 50% chance of going up on the ladder, and a 50% chance of going down on the ladder. However, after a large number of hits, don't we expect an even distribution of ups and downs? If that's the case, we won't be really going anywhere on the ladder. So, we don't have any expectation of gain or loss.

Quite contradictory.

Even more interesting: What if we had a 60% chance of losing half (moving down on the ladder), and a 40% chance of doubling (moving up on the ladder)? According to our first model, we should have an average exponential gain of 10% every time, but the second model says that our money would exponentially decrease on average!
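Both models can actually be right at once, and a simulation shows how: the arithmetic mean of the bankroll grows (it is dominated by rare lucky streaks), while the typical, median run goes nowhere at 50/50 and loses money at 60/40. A sketch, with function name and parameters of my own choosing:

```python
import random
import statistics

def final_bankrolls(p_up, steps=10, start=128.0, runs=20_000):
    """Play the ladder game many times: double with prob p_up, halve otherwise."""
    results = []
    for _ in range(runs):
        money = start
        for _ in range(steps):
            money = money * 2 if random.random() < p_up else money / 2
        results.append(money)
    return results

fair = final_bankrolls(p_up=0.5)    # the 50/50 game of the first model
tilted = final_bankrolls(p_up=0.4)  # 60% chance down, 40% chance up

# The mean grows in both games (factor 1.25 and 1.1 per step, respectively)...
print(statistics.mean(fair) > 128, statistics.mean(tilted) > 128)
# ...but the median run stays flat at 50/50 and shrinks at 60/40.
print(statistics.median(fair), statistics.median(tilted) < 128)
```

The mean is pulled up by a handful of runs that happened to double many times in a row; the ladder intuition describes what happens to the *typical* run, and at 60/40 the typical run halves more often than it doubles.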

So, which model is correct? :-)

This "paradox" can be solved can be solved with some easy probability calculations. Taking the case of an envelope with $100, and a second envelope with either $50 or $200:

You start with $100. If you switch, you have probability 0.5 of going to $50, and 0.5 of going to $200, so an expected "gain" of $25. Simple. But can you win by switching again? No! :-). You know for certain that whether you went to $50 or $200, the first envelope contains $100. So you oscillate, by switching over and over again, between a certain $100 and a 50/50 chance of $50 or $200. It is only worthwhile switching once; switching a second time (or any even number of times) is not a smart move.

Now consider the case of two envelopes, one with $N and the other with $2N. You don't know which one you get at first, so is it worth switching? Call what you have in the first envelope $M (which is either $N or $2N). There is probability 0.5 that $M is $N, and probability 0.5 that $M is $2N. So $M = (0.5)($N) + (0.5)($2N) = $(3N/2)--what you currently have. If you switch, there are two possible things you have done. There is 0.5 probability that you had $N and switched to $2N, and there is 0.5 probability that you had $2N and switched to $N. While this seems like the earlier case in which you start with an amount x and have a 50/50 chance of doubling or halving it, this case is different. You are either doubling the amount $N, or halving the amount $2N--these are different amounts! The amount you have on switching is the same as not switching: $(3N/2). It makes perfect sense if you keep in mind that since we don't know which of the two envelopes we have, we can't be sure we are benefiting from switching.
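A quick simulation of this symmetric game bears the calculation out (a sketch; the names and the choice n = 100 are mine): whichever envelope the coin hands you, keeping and switching both average out to $(3N/2).

```python
import random

def avg_payoff(switch, n=100, trials=100_000):
    """Two envelopes hold $n and $2n; a fair coin decides which one you get."""
    total = 0
    for _ in range(trials):
        if random.random() < 0.5:
            mine, other = n, 2 * n
        else:
            mine, other = 2 * n, n
        total += other if switch else mine
    return total / trials

# With n = 100, both strategies hover around $150 = (3/2)(100).
print(avg_payoff(switch=False))
print(avg_payoff(switch=True))
```

The $M/4 "gain" never shows up because the doubling and the halving are applied to different base amounts, exactly as argued above.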

The reason it seems like a paradox is because we are tempted to use this value "$M" as if it were a known value. But it can be one of two different values, $N or $2N, and we must take this into account.

Note (cf. Orange Julius): my calculations reflect only probability values, and when I say $M equals x, it actually averages out to x only over many tries. This way of writing should not affect my logic :-).
