wu :: forums (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi)
riddles >> hard >> HARD: ENVELOPE GAMBLE
(Message started by: srowen on Jul 28th, 2002, 7:38pm)

Title: HARD: ENVELOPE GAMBLE
Post by srowen on Jul 28th, 2002, 7:38pm
There is a flaw in the analysis presented in the problem. After you pick, there is a 0.5 chance that you picked the envelope with X and that switching gets you 2X. There is a 0.5 chance that you picked the envelope with 2X though (*not* X), and switching leaves you with X.

It's not valid to use the same X for both cases; that's the catch.

Your expected gain/loss from switching is:
0.5(2X-X) + 0.5(X-2X) = 0

...which is what you'd expect intuitively! So there is no expected gain for switching, and no loss. You can switch if you want.
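
For anyone who wants to check that numerically, here is a minimal simulation sketch in Python (the $100 value for X and the trial count are arbitrary assumptions for illustration):

import random

def simulate_switch_gain(trials=1_000_000, x=100):
    # Fixed pair (x, 2x); you pick an envelope at random and always switch.
    total_gain = 0
    for _ in range(trials):
        envelopes = [x, 2 * x]
        random.shuffle(envelopes)
        mine, other = envelopes
        total_gain += other - mine  # gain (or loss) from switching
    return total_gain / trials

print(simulate_switch_gain())  # hovers around 0, not 0.25 * x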

Title: Re: HARD: ENVELOPE GAMBLE
Post by Alex Harris on Jul 29th, 2002, 2:24pm
Actually I think it's a bit more fundamental than that. The reasoning stated in the problem is just as valid as the reasoning of using the same X for both computations, as you did. The problem is that we're implicitly drawing uniform random numbers from an unbounded distribution. The "payoff" for this game doesn't converge absolutely. By choosing different ways of analyzing it we're arranging the infinite sum in different ways, and thus it's converging to different answers, neither of which is really valid. Been a long while since statistics but I think that's right =P

Title: Re: HARD: ENVELOPE GAMBLE
Post by srowen on Jul 29th, 2002, 2:41pm
How does drawing random numbers come into this... the 2X sum is equally likely to be in either envelope, that's about it.

What's indefinite about the analysis? The expected value of switching is definitely 0.5X + 0.5(-X) = 0, not 0.25X. The payoff is definitely 1.5X whether you switch or not.

The riddle is just about finding the flaw in the reasoning given. Are you saying that you can analyze this in a way that shows the expected value of switching is not 0?

Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Jul 29th, 2002, 10:35pm
If X is drawn uniformly from the real numbers, or the positive powers of 2, or some other unbounded infinite set, then the analysis listed in the problem is no more faulty than the analysis you use in your solution determining a payoff of 0. You've come up with alternative reasoning which gets a different answer, but you didn't actually justify why the problem's reasoning is wrong, other than the fact that it doesn't get the same answer as yours does.

You're attempting to define the payoff in terms of X, but the distribution of X is very naughty and is not amenable to typical statistical computations. For example, what is the mean value of X? Infinite/undefined. So when you say something like "the payoff from switching is .5X - .5X = 0", what you're effectively saying is infinity - infinity = 0, but in truth infinity - infinity is not well-defined here.

If we put in a well-behaved distribution from which to draw X (say a uniform random number from 1 to 100), then the analysis stated in the problem breaks down, because for a given X it is not equally likely that X/2 and 2X are in the other envelope, and we do indeed get an expected payoff from switching equal to 0.

In short, the reasoning stated in the problem is "wrong" in the sense that it assumes something about the distribution of X (i.e. that given one envelope with some particular x in it, the odds of the other envelope having x/2 or 2x are even). This assumption, if true, causes the payoff of the game to not be well-defined.
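
To make the bounded-distribution point concrete, here is a rough sketch in Python (the 1-100 range comes from the example above; the probe values 60, 75 and 150 are arbitrary): once X comes from a bounded distribution, the amount you see usually tells you something about the other envelope, and always switching still gains nothing on average.

import random
from collections import defaultdict

trials = 1_000_000
counts = defaultdict(lambda: [0, 0])  # seen amount -> [times it was the low value, times it was the high value]
total_gain = 0

for _ in range(trials):
    x = random.randint(1, 100)       # low value, uniform on 1..100
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    mine, other = envelopes
    counts[mine][0 if mine == x else 1] += 1
    total_gain += other - mine       # always switch

print("average gain from always switching:", total_gain / trials)  # ~0
print("saw 60:", counts[60])    # roughly 50/50 between low and high
print("saw 75:", counts[75])    # always the low value (75 is odd, so it can't be 2X)
print("saw 150:", counts[150])  # always the high value (150 > 100)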

Title: Value of X is immaterial
Post by srowen on Jul 30th, 2002, 5:56am
The value of X is some constant and doesn't matter - for simplicity say that the envelopes contain $1 and $2. There is only one trial here, so there is no issue of the "distribution of X."

The statement that one has X and the other has 2X is simply a stated fact of the scenario. Nobody is randomly selecting values according to some distribution to go into these envelopes. Right?

Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Jul 30th, 2002, 9:49am
Ok. For some reason I was solving a generalization of the problem ::). Once we define a fixed pair (x,2x) then we can analyze just as you say, and the problem's logic breaks as I described: whether the opposite envelope is double your envelope is completely dependent on what's in your envelope. If we play the generalized game where we worry about how the game-setter chooses x, then he could be drawing it from a distribution that makes the reasoning of the problem correct, but if so he has to draw from an ugly distribution that makes the payoff outcome undefined (and is also rather hard to draw numbers from).

Title: Re: HARD: ENVELOPE GAMBLE
Post by srowen on Jul 30th, 2002, 10:40am

on 07/30/02 at 09:49:51, AlexH wrote:
If we play the generalized game where we worry about how the game-setter chooses x, then he could be drawing it from a distribution that makes the reasoning of the problem correct, but if so he has to draw from an ugly distribution that makes the payoff outcome undefined (and is also rather hard to draw numbers from).


Wouldn't the generalization be that we play a bunch of times, and each time, X is chosen according to some distribution, and he puts X and 2X into envelopes?

Instead of: X and Y are chosen from some distribution in such a way that one is twice the other - is that the right interpretation of what you're saying? I don't know how you would do that.

Regardless, either way, in this generalization, the expected value of switching envelopes remains 0. I don't see how it can be shown to be anything else.

The expected value of the game may be problematic, but the value of switching is not.

Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Jul 30th, 2002, 11:27am
I'm fine with your method of drawing some X from a distribution and letting the second envelope be 2X. Let's let that distribution have the property that for any particular x the probabilities of drawing x/2, x, and 2x are all equal (uniform on the positive real numbers is one such distribution).

If we're drawing numbers from this distribution, then what do you claim is wrong with the reasoning in the problem? You can come up with alternative, equally valid, reasoning that says that the answer should be different, but I'm asking you to point to where  the original reasoning goes wrong.

The reason two "correct" (i.e. correct except for sweeping the infinities under the rug) reasonings yield different answers is that the answer just isn't well defined.







Title: Re: HARD: ENVELOPE GAMBLE
Post by srowen on Jul 30th, 2002, 11:48am

on 07/30/02 at 11:27:20, AlexH wrote:
If we're drawing numbers from this distribution, then what do you claim is wrong with the reasoning in the problem? You can come up with alternative, equally valid, reasoning that says that the answer should be different, but I'm asking you to point to where  the original reasoning goes wrong.


Sorry, I still don't see what is not well-defined about any of this.

The original problem "computes" the expected value of switching as follows:

"You think to yourself that if your envelope has x dollars there is a 50% chance the other one has x/2 dollars and a 50% chance it has 2x dollars. The expected return, you compute, is .5[.5x + 2x]=1.25x which seems like a favorable gamble."

I claim that this is not valid, and should proceed like this:

"You think to yourself that there is a 50% chance that your envelope has x dollars and the other one has 2x dollars, and a 50% chance that your envelope has 2x dollars and the other one has x dollars. The expected return, you compute, is .5[x + 2x]=1.5x which is the same as the expected value without switching (1.5x)."

Or to be more succinct: "50% of the time, switching loses me X dollars. 50% of the time it gains me X dollars. The expected value of switching is .5(-x) + .5(x) = 0."

The best I can do to explain intuitively what is wrong with the problem's analysis is to say that it is subtly comparing apples to oranges... meaning that yes, there is a 50% chance that the other envelope has twice as much (2x) as yours (x), and a 50% chance that it has half as much - but in that case "half as much" means half of the 2x your envelope now holds, not half of the x from the first case!

Now, I don't believe you are arguing that the expected value of switching is *not* 0. That is definitely wrong. But the problem is clearly concluding otherwise, that the expected value is positive. So, let me ask, what is right about the problem's analysis then!

Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Jul 30th, 2002, 12:20pm
Remember that we've chosen this special distribution. Let's look at the distribution conditional on knowing that our envelope contains the value X.

This reduces us to 2 possibilities: either the pair of numbers is X, 2X and we drew low, or the pair is X/2, X and we drew high. The pair X/2, X has the same probability of occurring as does the pair X, 2X (according to the rules of our distribution), and the fact that we got X tells us nothing about which case we're in, because in both cases we have a 50/50 chance of selecting X as our envelope. So the case X,2X has the same odds as the case X,X/2. As in the problem statement, this gives us an expected payoff of X/4.

If you can't point to the flaw in the reasoning that gets the X/4 when you "know" that it's really 0, that should indicate that something weird is going on.





Title: Re: HARD: ENVELOPE GAMBLE
Post by srowen on Jul 30th, 2002, 1:02pm

on 07/30/02 at 12:20:27, AlexH wrote:
This reduces us to 2 possibilities: either the pair of numbers is X, 2X and we drew low, or the pair is X/2, X and we drew high.

This is the heart of the problem with the reasoning: the possibilities are X,2X and 2X,X - not X,2X and X/2,X.

The game you are analyzing (and the problem is analyzing) would go like this:
You pick one of two envelopes. It has X dollars in it. I put, with equal probability, either 2X or X/2 dollars in the other. Should you switch?

The expected value of switching in *this* game is indeed 0.25X.

Our game is:
You pick one of two envelopes. There is a 50% chance it has X dollars in it (in which case there is a 100% chance that the other has 2X dollars). There is a 50% chance that it has 2X dollars in it (in which case there is a 100% chance that the other has X dollars). Should you switch?

These are crucially different situations... in our game, once you pick an envelope, there is in fact not a 50/50 chance that the other has half or double. In fact, the contents of the other envelope are determined; it's either 2X or X, depending on whether you got X or 2X the first time.
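
A sketch of the two games side by side in Python (the $100 seed value and trial count are assumptions for illustration); the first shows the +0.25X gain, the second shows the 0 gain:

import random

TRIALS = 1_000_000

# Game 1: you see X, then the host puts 2X or X/2 into the other envelope with equal probability.
def game1_switch_gain(x=100):
    other = 2 * x if random.random() < 0.5 else x / 2
    return other - x

# Game 2: a fixed pair (X, 2X) is sealed first; you pick one at random, then switch.
def game2_switch_gain(x=100):
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    mine, other = envelopes
    return other - mine

print(sum(game1_switch_gain() for _ in range(TRIALS)) / TRIALS)  # ~ +25, i.e. 0.25 * X
print(sum(game2_switch_gain() for _ in range(TRIALS)) / TRIALS)  # ~ 0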


on 07/30/02 at 12:20:27, AlexH wrote:
If you can't point to the flaw in the reasoning that gets the X/4 when you "know" that it's really 0, that should indicate that something weird is going on.

Well, I think I've pointed out the flaw as clearly as is possible... not sure how much more I can say.

I understand that it's not enough to say "the argument is wrong because it doesn't produce the answer that I think is right," but in this case one can prove unequivocally that the expected outcome of switching is 0. All sorts of contradictions follow if it is not... it's like the "proof" that 2 = 1.

So again, do you agree or disagree that the expected value of switching is 0? As in, should we get into this, or are you playing devil's advocate?

Anybody else have an opinion?

Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Jul 30th, 2002, 2:27pm
Ok, one last try. I'm not just playing devil's advocate, I really do believe what I'm saying, but I am definitely not saying that 1/4 X is ever the correct answer for the payoff.

Let A be the envelope you have and B be the other envelope. We chose some value Y from our distribution which has the property that for any value Z, p(Y=Z/2) = p(Y=Z) and we put the values Y and 2Y randomly into the envelopes A and B. If our distribution is well-behaved (in particular I want the probability densities to be well defined) then everything I'm about to do is legitimate to the best of my knowledge.

I'll label the number in envelope A to be X. X is either Y or 2Y but we haven't determined which. For any particular value V:

p(A=V,B=V/2) = .5 * p(Y=V/2)
p(A=V,B=2V) = .5 * p(Y=V)

However our choice of distribution tells us that
p(Y=V) = p(Y=V/2)
so:
p(A=V,B=V/2) = p(A=V,B=2V).

This is true for any V, so let's pick V=X.
p(B= 2X|A=X)  = p(A=X,B=2X) / p(A=X)
= p(A=X,B=X/2) / p(A=X)
= p(B=X/2|A=X)

Since p(B=2X|A=X) = p(B=X/2|A=X) and we know A=X we now have:
p(B=2X) = p(B=X/2) = .5

The ONLY thing that I'm aware of that makes this reasoning fail is the fact that our distribution is not well behaved enough to apply probability densities. If there existed a well behaved distribution which still had the property that every pair of envelopes x/2,x was as likely as the pair x,2x then this reasoning would be perfectly correct in that domain and the payoff really would be X/4. Of course there is no such distribution and this is why the problem's logic must fail.


Title: Re: HARD: ENVELOPE GAMBLE
Post by srowen on Jul 30th, 2002, 3:16pm
Agreed, once more!


on 07/30/02 at 14:27:18, AlexH wrote:
I'll label the number in envelope A to be X. X is either Y or 2Y but we haven't determined which. For any particular value V:

p(A=V,B=V/2) = .5 * p(Y=V/2)
p(A=V,B=2V) = .5 * p(Y=V)

However our choice of distribution tells us that
p(Y=V) = p(Y=V/2)
so:
p(A=V,B=V/2) = p(A=V,B=2V).


Agreed, I believe. You are saying that if the game-setter is picking values randomly, values of $0.60 and $0.30 are just as likely as $0.60 and $1.20, and $0.40/$0.20 is as likely as $0.40/$0.80, etc. - in fact any $X/$2X pair is equally likely.

But crucially, one set of values is chosen and fixed when the problem starts. So none of that matters. Who cares what A=X and B=2X are, or how they were chosen - they are fixed values now.

p(A=X,B=X/2) = p(A=X,B=2X) only when A and B are random variables of the sort you describe. They are not - A and B are fixed and must be treated this way.

To illustrate I am going to say $1 for X, since X is as fixed as $1. What we really have is stuff like:

p(A=$2,B=$1) = 0.5
p(A=$1,B=$2) = 0.5
p(A=$2|B=$1) = 1
p(A=$1|B=$2) = 1
p(A=$0.50|B=$1) = 0

It is not true that p(A=X,B=X/2) = p(A=X,B=2X) when A and B are fixed.


on 07/30/02 at 14:27:18, AlexH wrote:
The ONLY thing that I'm aware of that makes this reasoning fail is the fact that our distribution is not well behaved enough to apply probability densities. If there existed a well behaved distribution which still had the property that every pair of envelopes x/2,x was as likely as the pair x,2x then this reasoning would be perfectly correct in that domain and the payoff really would be X/4. Of course there is no such distribution and this is why the problem's logic must fail.


Yeah, I can agree that that's a way to say it... if, given envelopes with $1 and $2, envelopes containing $1/$2 were somehow as probable as envelopes containing $1/$0.50, then yeah, I'm sure this and a lot of other false conclusions follow!

Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Jul 30th, 2002, 5:17pm

on 07/30/02 at 15:16:58, srowen wrote:
Agreed, once more!

But crucially, one set of values is chosen and fixed when the problem starts. So none of that matters. Who cares what A=X and B=2X are, or how they were chosen - they are fixed values now.

p(A=X,B=X/2) = p(A=X,B=2X) only when A and B are random variables of the sort you describe. They are not - A and B are fixed and must be treated this way.


We're getting closer.  :D

A and B are not "fixed" when we're trying to compute the expected payoff from the switch strategy in the generalized game. There is nothing wrong with treating them as random variables and in fact your approach is also treating them as random variables, but you're conditioning on a certain Y,2Y pair of numbers while I'm conditioning on a certain value of A=X. Either of these approaches is valid, or would be on a well behaved distribution.

To put it another way, your "A and B are fixed" approach is breaking down the set of all possible occurrences of the game into the subsets which have the property that A and B are drawn from a particular pair Y,2Y. The alternative of conditioning on A=X gives us the subsets where A happens to be X. As long as we both account for every case it won't matter how we choose to pair them up. Once again the caveat here: if the distribution is sufficiently ugly then there is essentially no answer to the questions we're asking, because we can get lots of different answers.




Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Jul 30th, 2002, 5:51pm
What's going on here is actually very close to the following.

Consider the numbers  
1, -1, 1/2, -1/2, 1/3, -1/3, 1/4, -1/4, .....

What is the sum of this series?
If you choose to group it the obvious way it looks like the sum goes to 0. But I could group it as
1, -1, 1/2, 1/3, -1/2, 1/4, 1/5, 1/6, -1/3, ...
I'd still be hitting each number exactly once and this sum would head to infinity.

While the =0 way of grouping them is certainly aesthetically appealing, it's not actually any more correct than the diverge-to-infinity way. The sum of the series is just not well-defined (it doesn't converge absolutely).

Any distribution satisfying p(Y) = p(2Y) has this kind of messiness in spades, and it's just not sensible to talk about the payoff or expectations from such a game.
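
A quick sketch of that rearrangement in Python (it follows the grouping described above: one extra positive term is taken before each successive negative term):

def rearranged_partial_sum(groups):
    # Terms are 1/n and -1/n for n = 1, 2, 3, ...
    # Rearranged as: 1, -1, 1/2, 1/3, -1/2, 1/4, 1/5, 1/6, -1/3, ...
    # (k positive terms, then the k-th negative term)
    s = 0.0
    next_pos = 1  # the next positive term is 1/next_pos
    for k in range(1, groups + 1):
        for _ in range(k):       # take k positive terms
            s += 1.0 / next_pos
            next_pos += 1
        s -= 1.0 / k             # then the k-th negative term
    return s

for g in (10, 100, 1000, 5000):
    print(g, rearranged_partial_sum(g))
# The partial sums keep growing (roughly like ln(g)), even though every term is used exactly once,
# while the obvious pairing 1 - 1 + 1/2 - 1/2 + ... stays at 0.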

Title: Re: HARD: ENVELOPE GAMBLE
Post by srowen on Jul 30th, 2002, 7:33pm
I think I get the gist of your argument, that if you start the other way and assume the analysis in the problem is correct, you find that the scenario it must be supposing cannot be reconciled with the riddle. Eh?

Mostly I think you are imagining harder problems here more worthy of your skills. If I'm misunderstanding then feel free to hit me one more time... I'll lay off and wait for a third unsuspecting person to weigh in.

Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Jul 30th, 2002, 7:47pm
That's about correct. The problem's logic makes certain assumptions about the distribution which, if true, force the distribution to be so wacky that things like payoff aren't defined.

You're completely right that this only applies to the generalization of the problem. In a single concrete case of 2 envelopes and 2 numbers then the "distribution" is simply a 100 percent chance of some particular pair Y,2Y and that certainly doesn't have the p(V) = p(2V) property which I needed to make the logic sound.

Enough time spent here ... on to the other puzzles! :) Well, maybe to sleep first but there is always tomorrow.

Title: Re: HARD: ENVELOPE GAMBLE
Post by JimP on Aug 2nd, 2002, 1:03pm
I'm intrigued as to why you guys think a probability distribution is applicable only to a repeated scenario.  Are you saying that multiple single gambles should yield a different result to one multiple gamble?

Title: Re: HARD: ENVELOPE GAMBLE
Post by AlexH on Aug 2nd, 2002, 4:02pm
If you consider the numbers as coming from a possibly ugly distribution then you can make things like the expectation value undefined. The key thing isn't that you're doing multiple runs but that you're considering the distribution of the numbers. Whenever it's well defined the expectation is 0, but you could pick a distribution for which it's not well defined. The payoff of a single run, if you consider a bad distribution, can be undefined. Really I only brought the issue up because I misread the problem as being this more general case rather than the concrete case of 2 fixed numbers in the envelopes, and then srowen and I got talking about it.

Title: Re: HARD: ENVELOPE GAMBLE
Post by James Fingas on Dec 20th, 2002, 10:02am
S.Owen,

Many months after reading your analysis, I have finally come to the conclusion that I don't agree.

If you've got X in your envelope, and it's 50/50 that the other envelope has X/2 or 2X, then you'd better switch! Your expected gain is very easy to compute, and the problem statement computed it correctly.

Let's suppose I were to try this trick a number of times, just to convince you. Each time, I would give you an envelope that contained X dollars. Each time, there'd be a 50/50 chance of the other envelope containing 2X or X/2 dollars. You can switch as many times as you want. Believe me, you will do better switching!

The exact problem with your argument is where you say:

0.5(2X-X) + 0.5(X-2X) = 0

This equation is correct, but doesn't apply to the situation at hand. In the first term, 0.5(2X-X), X is the amount in your envelope, and 2X is the amount in the other. In the second term, 0.5(X-2X), you redefine X so that 2X is the amount in your envelope. You're using two different definitions of X in the same equation.

Now let me illustrate why Alex is bang on with the distribution argument. You know that when the host of the show picks the two amounts of cash, he is not picking uniformly on zero to infinity. You may say "that doesn't matter", but in fact it does. Think about it--the larger the amount you're given, the less likely it is that the other envelope has twice that much. That is exactly why the expectation is zero.

If you get $50, then the other envelope probably has $100. If you get $50 000, then the other envelope could have $100 000. But if you're offered $1 000 000, then it's very unlikely that the other envelope has $2 000 000, right? If you got $1 000 000 000 000, then the other envelope probably doesn't have $2 000 000 000 000. And these numbers are very small on the scale of zero to infinity! This is where the distribution the numbers are chosen from is really important in practical terms.

Title: Re: HARD: ENVELOPE GAMBLE
Post by S. Owen on Dec 20th, 2002, 2:20pm

on 12/20/02 at 10:02:27, James Fingas wrote:
Let's suppose I were to try this trick a number of times, just to convince you. Each time, I would give you an envelope that contained X dollars. Each time, there'd be a 50/50 chance of the other envelope containing 2X or X/2 dollars. You can switch as many times as you want. Believe me, you will do better switching!

The exact problem with your argument is where you say:

0.5(2X-X) + 0.5(X-2X) = 0

This equation is correct, but doesn't apply to the situation at hand. In the first term, 0.5(2X-X), X is the amount in your envelope, and 2X is the amount in the other. In the second term, 0.5(X-2X), you redefine X so that 2X is the amount in your envelope. You're using two different definitions of X in the same equation.


I believe you are saying that the riddle is equivalent to this:

I give you an envelope with X dollars. I then show you another envelope, which is equally likely to have 2X or X/2 dollars. Do you switch?

Obviously you would switch in this case, and the point of the riddle is that it's trying to convince you that this is the same as:

I let you choose one of two envelopes - one has X dollars, the other 2X dollars. Then I ask if you'd like to switch to the other because of blah blah blah. Do you switch?

It's not though, of course - in the first scenario, switching clearly has a positive expected payoff, while in the second, it clearly has a 0 payoff.


I do see what you mean about distribution - generally, smaller values are more likely than larger values. So, on the whole, switching is probably slightly more likely to lose you money, right? I agree with this, given the reasonable assumption about the distribution of amounts in the envelopes. But the riddle is trying to convince you that switching is always profitable! I don't think you need any such assumption to show that *that* is not the case.


In any event I don't think that is the distribution argument that AlexH was making, though I confess that his argument is too subtle for my brain.

Rather than make the obvious case about why the riddle's argument is wrong, he advances a subtler argument, which probably makes fewer assumptions:

If the problem's reasoning is correct, and switching, astonishingly, always has a positive payoff, then there must be something impossible going on with the distribution of values in the envelopes... there is some contradiction waiting there.

This probably works too but it's heady stuff!


Title: Re: HARD: ENVELOPE GAMBLE
Post by fenomas on Dec 21st, 2002, 12:44am

on 12/20/02 at 10:02:27, James Fingas wrote:
S.Owen,

If you've got X in your envelope, and it's 50/50 that the other envelope has X/2 or 2X, then you'd better switch! Your expected gain is very easy to compute, and the problem statement computed it correctly.

Let's suppose I were to try this trick a number of times, just to convince you. Each time, I would give you an envelope that contained X dollars. Each time, there'd be a 50/50 chance of the other envelope containing 2X or X/2 dollars. You can switch as many times as you want. Believe me, you will do better switching!


James, with how similar this is to our dilemma in the littlewood thread, I think I can help show how you are off here, without involving distributions. In your version, you're given an envelope with X dollars, and you can switch to another that contains either 2X or X/2 dollars. However, this is very different from the actual situation, because you've been changing the definition of "X" in the middle of the problem.

The proper way to think of the problem is like this:
You are offered two envelopes, one containing X dollars, and the other containing 2X dollars. You choose one of them. Set Y = the value of the envelope you chose. Now, you have the option of switching to the other envelope, which contains either Y/2 or 2Y dollars (not X/2 or 2X!!).  But if you switch, you don't get Y/2 or 2Y with equal odds- the question of which you get is entirely determined by which envelope you chose before. If you chose the lesser envelope (Y=X), then switching gets you 2Y, or 2X dollars. If you chose the higher (Y=2X), switching now gets you Y/2, or X dollars. So before you chose, you had an expectation of 1/2(X+2X) dollars, and switching gets you an expectation of 1/2(X+2X) dollars.

Hopefully, this makes your fallacy clear. Another way to say it is: You can't wind up with X/2 dollars because there was never an envelope containing X/2 dollars to begin with!

Title: Re: HARD: ENVELOPE GAMBLE
Post by fenomas on Dec 22nd, 2002, 7:52pm

on 12/21/02 at 00:44:08, fenomas wrote:
a bunch of stuff about redefining X


Hoo-ah, just noticed that my previous post just described in more detail what S.Owen said in the very first post!
Oh well, it's still right. ;)

Title: Re: HARD: ENVELOPE GAMBLE
Post by Kozo Morimoto on Dec 23rd, 2002, 7:59pm
Look at:
http://www.wilmott.com/310/today_detail.cfm?articleID=107

Not sure if you need to register, but here is the transcript:

I. Money, money, money
Two different amounts of money are placed into envelopes. One envelope is selected at random and given to you. The other envelope is given to Paul. Neither you nor Paul know the amounts. Paul offers you a bet, which you may take or leave. The bet is that after opening the envelopes whoever has the larger amount of money gives it to the other, leaving him with nothing.
Call the amounts of money in the envelopes X and Y with X>Y. There is a 50% chance your envelope contains X and a 50% chance it contains Y. If you turn down the bet, your expected winnings are (X + Y)/2 and your standard deviation is (X - Y)/2. If you accept the bet you have the same expectation, (X + Y)/2, but the standard deviation is now (X + Y)/2. [You should check these for yourself!] You turn down the bet because it leaves your expected value unchanged and increases your standard deviation.


Q: Is this the correct decision under normal economic utility theory assumptions?

Yes, but see below.

II. The envelope please. . .

You open your envelope and find $100. Paul asks you again if you want to bet. Now you reason that you have a 50% chance of winning or losing. If you lose, you lose $100. If you win, you win more than $100. So your expected value is positive and you take the bet.

Q: Is it correct that your expected value is positive?

Yes, but see below.

Q: Is Paul’s expected value also positive for the same bet?

Yes, but see below.

Q: If so, where does the extra value come from?

This question hurls us into the contentious area of philosophy of probability. The first three questions are answered by using simple tools in standard ways. Now we have to think about why they give apparently inconsistent answers.

There are people who call themselves Bayesians (because they use Bayes Rule a lot, not because Nonconformist Reverend Thomas Bayes was one) who demand consistency in all things. They say probability refers to subjective belief. Observation updates our prior beliefs, and any complete probability calculation must specify a prior distribution, what you believed before making the observation. To a Bayesian your failure to update your probability of winning after opening your envelope implies that your prior distribution of amount of money in the envelopes had infinite expected value. You take the bet because however much money is in your envelope, it is less than the (infinite) expected value of the other envelope. There is no extra value, any more than there is anything left over when you match up all integers with all even integers. Infinity works that way.

Bayesians are a minority, and the majority of statisticians feel no need to name themselves. Bayesians call them “frequentists” or “objectivists.” These people do not demand consistency. To them, the additional value arises from an inconsistency in your probability model. All standard statistical methods have these inconsistencies and you don’t worry about them unless you want people to call you a “closet Bayesian.”

Q: And why would you take a bet after you open the envelope, but not before?

Before you open the envelope your probability of winning is clearly 50%. Once you open it, the probability becomes model-dependent. Models do not always lead to paradox, but they always create the possibility of paradox.

Q: Does it matter if Paul sees how much you had in your envelope?

Now we have a second-order model. We have to figure out what it means if Paul takes the bet. We want to know who has to accept first, or if there is some way for both of us to do it at once. We get inconsistencies even without that nonsense.
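
The transcript's part I says to check the expectation and standard deviation for yourself; here is a quick numeric sketch in Python (X = 200 and Y = 100 are arbitrary sample amounts):

from statistics import mean, pstdev

X, Y = 200, 100   # X > Y, arbitrary sample amounts

# Turn down the bet: you keep whichever amount your envelope holds.
no_bet = [X, Y]        # the two equally likely outcomes
# Accept the bet: whoever holds the larger amount hands it over.
bet = [0, X + Y]       # either you lose everything, or you end up with both amounts

print(mean(no_bet), pstdev(no_bet))  # 150, 50   i.e. (X+Y)/2 and (X-Y)/2
print(mean(bet), pstdev(bet))        # 150, 150  i.e. (X+Y)/2 and (X+Y)/2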

Title: Re: HARD: ENVELOPE GAMBLE
Post by fenomas on Dec 24th, 2002, 1:03am
Wow, thanks for the info, Kozo.
Now what can that tell us about Littlewood's number game??  ???

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Dec 24th, 2002, 1:10am

on 12/23/02 at 19:59:05, Kozo Morimoto wrote:
Q: Is Paul’s expected value also positive for the same bet?

Yes, but see below.

If Paul hasn't opened his envelope and knows you have 100, he could have 50 or 200. So his expected value is 125. If he wins, he has 50 and gets 100, so that's 150; if he loses, he has 200 and loses it. So after the bet his expected value is 75. His expected gain is -50. Not at all positive..

Perhaps it's a matter of perspective, who expects what for whom..

Title: Re: HARD: ENVELOPE GAMBLE
Post by S. Owen on Dec 24th, 2002, 8:01am
After thinking for a while, I think I get this one. I'm not sure I fully appreciate the subtlety, but here's a first attempt.

The hidden twist here is that we're sort of saying that X and Y are randomly chosen positive numbers, chosen from (0, infinity). Keeping that in mind, the contradictions go away, in the sense that one realizes that we're talking about infinities.

The expected value for each of them before the bet is really P(X is in envelope)*E(X) + P(Y is in envelope)*E(Y) = 1/2(E(X) + E(Y)) - but this is infinity since the expected value of each is infinite, given what we know.

If it's then revealed that you have 100 (and Paul has Z, where Z is the other value; it's positive and could be more or less than 100), then your expected gain from the bet is P(Z < 100)*(-100) + P(Z > 100)*E(Z). But this is again infinite, since P(Z < 100) is infinitesimal, P(Z > 100) is essentially 1, and E(Z) is infinite.

So, that I think solves the mystery of why just knowing how much money you have suddenly makes the bet attractive, no matter how much the value actually is.

Now as for Paul though... I can only assume that the post is talking about the original bet or something, because he has "everything to lose" - following the previous reasoning, his expected gain from taking the bet is infinitely negative, since his expected value before knowing how much he has is infinite.

Title: Re: HARD: ENVELOPE GAMBLE
Post by BNC on Dec 25th, 2002, 9:40am
OK, I hope I won't repeat too much of what was already posted.

I heard this one a long time ago, and that version was in two stages:

1. You open the envelope and find $100. Thus, the other envelope contains either $50 or $200 (note that you do not know in advance what X is, so the argument that you have in your envelope either X or 2X is invalid -- you always, by definition, got X). So, you should switch. No wonder there.

2. Now, you don't open the envelope. You figure: since X is unbounded, no matter what value I would find in the envelope, I would still choose to trade. So, you trade. But then, given the option to trade back -- you will again, for the same reasoning!

Well, I don't know if it's true, but from what I heard back then, this is an unsolved paradox. There are attempts to bypass this paradox by limiting the higher amount by "the total of money in the world = H" (so, if you get > H/2, don't switch) and the lower amount by "the lowest currency unit in the world = L" (so, if you get L you switch).
IMHO, these are unsatisfactory solutions. And, as I said, I don't think an actual solution exists. I will be happy to learn otherwise, though.


See ya' around,
BNC

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Dec 25th, 2002, 12:15pm

on 12/25/02 at 09:40:18, BNC wrote:
2. Now, you don't open the envelope. You figure: since X is unbounded, no matter what value I would find in the envelope, I would still choose to trade. So, you trade. But then, given the option to trade back -- you will again, for the same reasoning!
You may not know what X is, but it is a set value..
The envelope you have has expected value X, you trade because the expected value of the other is (0.5X+2X)/2 = 1.25X.
Given the chance to trade back you don't, because the expected values don't change till your information changes, so your current envelope keeps expected value 1.25 X



Title: Re: HARD: ENVELOPE GAMBLE
Post by BNC on Dec 25th, 2002, 11:44pm
Yeah, you don't have new information. None, whatsoever. Including the value of X (it is set, yes, but still unknown).

Let's define your new envelope as having value Y (=1.25X, but X is arbitrary, so Y is arbitrary as well).
The value of the first envelope, although X originally, may now be represented as 1.25Y (and not 0.8Y) -- I think. I'll be happy to hear an explanation as to why the other value is not 1.25Y -- in terms of Y, not X.


Title: Re: HARD: ENVELOPE GAMBLE
Post by GRAND_ADMRL_THUORN on Feb 12th, 2003, 12:34pm
I've read every entry in this thread and, like always, you guys break it down incredibly well. Anyway, after reading everything and some thinking on my own, I have to conclude what I thought when I first read this riddle: changing is irrelevant.

Title: Re: HARD: ENVELOPE GAMBLE
Post by James Fingas on Feb 12th, 2003, 2:51pm
Grand admiral Thuorn,

I definitely do not agree with you. Paradoxically, you do expect to gain money by switching. Let me remodel the question like this:

1) I pick Y uniformly between 0 and 100 dollars.

2) Now I fill two envelopes, one with Y dollars, and one with 2Y dollars. I then tell you that Y is between 0 and 100.

3) Now open one of the envelopes. Here's where it gets tricky:

4) As per this question, suppose that you open the envelope and discover less than 100 dollars. Computing your payoff, you expect to win money by switching. But how can that be?

5) The catch is that to get a positive expectation, you must find an amount less than $100. If you find an amount larger than $100, you expect (in fact, you are guaranteed) to lose money.

6) Now, I choose Y between 0 and 1000, or 0 and 1 000 000, or 0 and Z (a finite number). The expectation for switching is always positive if X is smaller than Z, the maximum possible value for Y. Of course, 50% of the time, X will be larger than Z, but that's another matter.

7) Here's where the trick is played. You state that Y is chosen between zero and infinity. Then, you say that you find X dollars in the envelope, implying that X is finite. If X is finite, then it must be smaller than Z, which is infinite. Therefore, you have a positive expectation.

8) The problem is that there is absolutely zero chance of finding a finite number in the envelope if Z is infinite. It's not just unlikely that you'll get a finite number, it cannot happen.
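
A sketch of the bounded version of that argument in Python (Y uniform on 0-100 dollars as in step 1, always switching; the trial count is arbitrary):

import random

TRIALS = 1_000_000
gain_total = 0.0
gain_under = seen_under = 0
gain_over = seen_over = 0

for _ in range(TRIALS):
    y = random.uniform(0, 100)     # step 1: Y uniform on (0, 100)
    envelopes = [y, 2 * y]         # step 2
    random.shuffle(envelopes)
    mine, other = envelopes
    gain = other - mine            # always switch
    gain_total += gain
    if mine < 100:
        seen_under += 1
        gain_under += gain
    else:
        seen_over += 1
        gain_over += gain

print("overall average gain:", gain_total / TRIALS)                  # ~0
print("average gain when you saw < $100:", gain_under / seen_under)  # positive
print("average gain when you saw >= $100:", gain_over / seen_over)   # negative: you were holding 2Y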

Title: Re: HARD: ENVELOPE GAMBLE
Post by TimMann on Feb 13th, 2003, 12:46am
Picking some nits:


on 02/12/03 at 14:51:30, James Fingas wrote:
Now, I choose Y between 0 and 1000, or 0 and 1 000 000, or 0 and Z (a finite number). The expectation for switching is always positive if X is smaller than Z, the maximum possible value for Y. Of course, 50% of the time, X will be larger than Z, but that's another matter.

Don't you mean 25% of the time? In the case where X = 2Y (50% of the time), it's still not necessarily true that X > Z, as in 50% of those cases, Y was in the low half of its range.


Quote:
The problem is that there is absolutely zero chance of finding a finite number in the envelope if Z is infinite. It's not just unlikely that you'll get a finite number, it cannot happen.

I think you meant to say "The problem is that there is absolutely zero chance of finding an infinite number in the envelope if Z is infinite. It's not just unlikely that you'll get an infinite number, it cannot happen."

If that's what you meant, I follow your reasoning. Let's take it a bit further to be more explicit:

9) So the fact that you found a finite number (i.e., a number less than Z) tells you nothing. Unlike in the finite case, the fact that the number you saw was less than Z no longer tells you that this is a good time to switch. Also, it's no longer true that in 25% of the cases you'll find a number larger than Z in the first envelope you open.

Title: Re: HARD: ENVELOPE GAMBLE
Post by James Fingas on Feb 17th, 2003, 12:15pm
Tim,

Good point on the 50%/25% thing.

However, in terms of the finite/infinite thing, I actually meant what I said.

Consider the cdf of the uniform random distribution over all the reals. That cdf is identically zero. Therefore, picking any finite value R, it is absolutely impossible that we can find a number less than R in the envelope (just as impossible as getting -54 when picking a random number from 0 to 10).

So when we do find a finite number, we know that the envelope-stuffer didn't use the uniform distribution over all the reals.

Another corollary of this is that it's actually impossible to choose a number uniformly at random from the set of all real numbers, but that's not important for this question.

Title: Re: HARD: ENVELOPE GAMBLE
Post by TimMann on Feb 20th, 2003, 2:28am
By "it's impossible to choose a number uniformly at random from the set of all reals," do you mean to say that "the uniform random distribution over all the reals" doesn't exist? That is, by assuming that it exists, you've reasoned your way to a contradiction, and therefore you have reduced the idea of a uniform random distribution over all the reals to absurdity. That may be correct (I'm not quite certain), but your previous post with the 8 steps didn't really explain the reasoning in steps 7-8, so it was not clear that's what you were doing.

Backing up a bit, suppose that we naively assume there is such a thing as a uniform random distribution over all the reals. Then it's clear that if we choose a number from this distribution, it will be finite, because all the reals are finite. It's also clear that the probability of getting any particular real number must be 0, but of course that does not mean that it's impossible.

The difficulty is that no function exists that is either the density function or the cumulative distribution function of this distribution. Certainly f(x) = 0 is not the cdf (despite what you said), because f(x) does not approach 1 as x goes to infinity, so it is not a cumulative distribution function at all. Equally, f(x) = 0 is not the density function, because the integral of f(x) from -oo to oo is 0, not 1. Neither is any other function, as the function must be constant to be uniform, but the integral of any other constant function from -oo to oo diverges.

That suggests that indeed there is no such distribution. However, I don't claim to know enough about the foundations of probability theory to say whether that's a valid conclusion or not. It wouldn't seem to make much practical difference, as a distribution for which you can't define either a density function or a cdf doesn't seem useful for much other than as a pathological case.

It certainly can't be true that the distribution exists, but that you get an infinite number, a blank piece of paper, or a rhinoceros when you choose from it, as that would be a contradiction too.


Title: Re: HARD: ENVELOPE GAMBLE
Post by James Fingas on Feb 20th, 2003, 12:41pm
Tim,

That's a good question. To be honest, I don't know whether the uniform distribution over all the reals "exists" or "does not exist". It's certainly not a well-behaved distribution, and that's why the cdf is zero for all finite real numbers. You could argue that the cdf still goes to 1 near infinity, but that's getting a little too fuzzy for me.

Whether or not the distribution exists seems to me to be more of a philosophical problem than a mathematical one. The question implies that such a distribution does exist and that you can pick a finite number out of it, and so I assumed that it does exist.

You are correct in saying that a zero chance of picking a number doesn't necessarily mean it's impossible. For instance, when you choose a random number from 0 to 1, there's zero chance of getting any specific number in the distribution, but you always do get one specific number. However, in the case of the unbounded uniform distribution, we are talking about a whole different kettle of fish. The chance of getting a specific finite number is zero, but the chance of getting a number within any finite range is also zero.

I think the best way to avoid this conundrum is just to assume that the envelope-stuffer is using a different distribution. Although the question doesn't tell you what that distribution is, it can't be the unbounded uniform distribution because the envelope-stuffer has no good way to choose a number from that distribution. The properties of such a distribution, and whether or not it exists, can be removed from the scope of the problem.

Title: Re: HARD: ENVELOPE GAMBLE
Post by Icarus on Feb 20th, 2003, 4:28pm
Does a uniform distribution over all the reals exist? No.
Being uniform, the probability density is constant. But the integral of any non-zero constant over all the Reals is infinite, while the integral of zero is of course zero. In neither case can you get the required integral of 1.

Title: Re: HARD: ENVELOPE GAMBLE
Post by James Fingas on Feb 24th, 2003, 10:05am
Icarus,

I agree with you, but it seems to me that the problem is essentially one of limits.

Certainly no pre-defined "uniform distribution over the reals" can exist, but when we look at it as the limit of the uniform distributions over [0,Z), then we can still have properties which hold as we take the limit. For instance, half of the values you choose from this distribution will be less than Z/2. These properties are what allow the expectation to remain zero for all finite uniform distributions.

I think the expectation still exists as long as we are taking the limit, but if we start from the "uniform distribution over the reals", then we have nothing to work with. It's just like trying to find the value of 0/0. Depending on how you got to 0/0, you can find the value in different ways (eg using L'Hospital's rule). But if somebody just gives you "0/0", you can't work backwards.

In the same way, working from the finite uniform distributions, we expect a certain payoff for switching envelopes, which we can take the limit of. Working from the other direction, we get nonsense. It would be nice if we could do this:

0.75*integral( x=0, x=inf/2, 1.25X ) + 0.25*integral( x=inf/2, x=inf, 0.5X )

However, mathematics can't get an answer out of this any more than it can get an answer out of 0/0.

Title: Re: HARD: ENVELOPE GAMBLE
Post by pedronunezmd on May 31st, 2004, 10:10am
Over a year since last post on this riddle, but I personally don't think that the real answer to this riddle has yet been posted, so here goes. (Hope I don't inflame anyone.)

First of all, I hope that everyone agrees with S.Owen's very first post, which is the correct way to analyze the question of whether or not you should switch. The expected gain from switching is zero. If anyone truly believes that the expected gain (by switching) is actually 0.25x, then I would be more than happy to write a Visual Basic program that simulates the game and play it with you, where you agree to pay 1.125x (where x is the result of your first opened envelope) each time and I pay you whatever is in the second envelope (which has already been determined before you picked your envelope), and we run the number of trials at around 1 million or so. (If the expected gain from changing is truly .25x, then you will gain in the long term by 1.25x - 1.125x = .125x, and if it is 0, obviously you will lose in the long term 1.125x - 1x = .125x.)
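
(A rough sketch of that proposed simulation follows, in Python rather than Visual Basic; the fixed $100/$200 pair and the trial count are assumptions for illustration.)

import random

TRIALS = 1_000_000
low = 100                         # assumed fixed pair: $100 and $200
net = 0.0
opened_total = 0.0

for _ in range(TRIALS):
    envelopes = [low, 2 * low]
    random.shuffle(envelopes)
    mine, other = envelopes       # mine = the envelope you open first, worth x
    opened_total += mine
    net += other - 1.125 * mine   # you pay 1.125x and receive whatever is in the other envelope

print("average net per round:", net / TRIALS)
print("as a fraction of the average opened amount:", net / opened_total)
# If the 1.25x reasoning were right, you would come out ahead; in fact you lose
# about 0.125 of the average opened amount per round.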

The reason I believe the riddle is still unsolved, however, is that no one has been able to point out the actual fallacy in the reasoning stated in the riddle that ultimately leads to the (incorrect) assertion that the expected return is 1.25x.

The error in the logic of the riddle is more fundamental than all this talk about "random distribution" and "infinity". If anyone cares anymore, please post here and I'll provide the correct answer to why the logic in the riddle is not correct, but consider this my hint to the solution of the riddle.

(I can't imagine why people I know in real life have called me narcissistic...)

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on May 31st, 2004, 11:12am
I'm always willing to hear new thoughts on classic paradoxes like this one.

I'm sure other people are also curious.

Title: Re: HARD: ENVELOPE GAMBLE
Post by Grimbal on May 31st, 2004, 12:51pm
I think it is a distribution problem.

Whatever the method you use to choose the amounts to put in the envelopes, the probability of (x,2x) being in the envelopes must tend to 0 when x goes to infinity.  If you choose real numbers, the density function must tend to zero.

So, at least for some values, the probability of (x,2x) is lower than the probability of (x/2,x).  For instance, in the case where there is a maximum M, if you see x > M/2, you certainly don't switch.  The expectation of switching would be x*0.5.  If not, you are twice as likely to have chosen the low number.  Switching gives you an expectation of x*1.5.
If you don't know the number, either you have a high number in range M/2..M averaging 3/4*M and switching brings you down to 3/8*M, or you have a low number in range [0..M/2], averaging M/4, and switching brings you 1.5*M/4 = 3/8*M.  Either way is the same.

If the set of values is discrete, obviously, there are some values that cannot be divided.  If you see an amount like $6.11, and there is no half-cent, it must be the low value.  These are cases where (x/2,x) is actually less likely than (x,2x).

If you use continuous values, and you choose the low value with density f(x) and the high value as 2x, then to say that (x/2,x) and (x,2x) are equally likely is to say that f(x) = 2*f(2x).  One solution of this is f(x) = c*1/x.  Obviously, it cannot be integrated.  f(x) must decrease faster than that.

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on May 31st, 2004, 1:27pm
After you've opened the first envelope
you've either got x, and by switching go to 2x, a gain of x.
Or you have 2x, and by switching go to x, a 'gain' of -x
(x+-x)/2 = 0 no mystery..

Total expected value from the start remains 1.5x

Title: Re: HARD: ENVELOPE GAMBLE
Post by Grimbal on May 31st, 2004, 4:30pm
But suppose you open the envelope, it is $42.  You can either win $42 or lose $21, right?  That is +$10.5, right?
In fact, after you open the envelope, whatever the value, you should switch, right?
So the best thing to do is open and switch.  Whatever the value.  No?
So you know that you are going to switch anyway, right?
Why not switch immediately?
If you switch immediately, before opening, do you want to switch again?

Title: Re: HARD: ENVELOPE GAMBLE
Post by pedronunezmd on May 31st, 2004, 9:00pm

on 05/31/04 at 13:27:42, towr wrote:
After you've opened the first envelope
you've either got x, and by switching go to 2x, a gain of x.
Or you have 2x, and by switching go to x, a 'gain' of -x
(x+-x)/2 = 0 no mystery...

Like I said before, I think everyone agrees that this way (which is how S.Owen explained it in the first post of this thread) is the correct way to analyze the problem. However, it still does not explain why the reasoning in the riddle is not correct.

Now I'm going to try to explain the fallacy in the logic of the riddle which results in the expected return of 1.25X. I don't have an easy way to explain in words what I am thinking, probably because I lack a strong mathematical background, so I'll try to explain the best I can, and someone can put the final result into better words.

First of all, the rules stated that the envelopes will have a non-zero sum of money, meaning no negative values for any envelope. This should be obvious, but I have to state it outright. Then...

Once you've picked your first envelope, if you call the monetary amount you have in there X, you cannot say that there is a 50% chance that you are looking at the X of the (X/2,X) possibility versus a 50% chance that you are looking at the X of the (X,2X) possibility. The reason you cannot do this is that you would then be stating that you have the higher of the 2 numbers if you have the (X/2,X) possibility, and the lower of the 2 numbers if you have the (X,2X) possibility. You are now looking at only a subset of the (X/2,X) versus (X,2X) set of possibilities. You are basically stating "I'll know X when I see it" before you have picked your envelope.

Basically, by defining X as "whatever amount is in the envelope I picked", you are playing the game where you will always have the higher value of the (X/2,X) possibility or the lower value of the (X,2X) possibility. You have now defined a new game.

You are now calculating the expected payoff for an entirely different game, namely the game where you pick an envelope, but it is guaranteed to be either the higher-valued of the two envelopes if you have the (X/2,X) possibility, or the lower-valued of the two envelopes if you have the (X,2X) possibility. In this (different) game, it would make sense to always switch. But this different game has no relevance to how the actual game is being played.

After re-reading everything I just wrote just now, I think that many will still disagree with me. Hopefully at least someone smarter than me will understand what I'm trying to say and restate it better.

Title: Re: HARD: ENVELOPE GAMBLE
Post by THUDandBLUNDER on May 31st, 2004, 9:21pm

Quote:
However, it still does not explain why the reasoning in the riddle is not correct.

As with the d(x^2)/dx = x fallacy, I think it is a case of confusing constants with variables.


Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Jun 1st, 2004, 12:28am
I agree,
you either have X from the pair (X, 2X) or Y from the pair (Y/2, Y).
If you think Y=X, then you'd expect 1/4 X gain
But (X, 2X) = (Y/2, Y), because it's the same pair of envelopes. So Y=2X and expected gain is 0

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Jun 1st, 2004, 6:14am
Having been traumatized by physics courses, I regard the envelope I pick as being in a quantum superposition of X and 2X so having a value of 1.5X. The other envelope has the same value, so there's no point switching.

Of course, once you open an envelope, you have additional data on the actual value of X, which, combined with information on the distribution of X, may affect your decision to switch...

Title: Re: HARD: ENVELOPE GAMBLE
Post by asterix on Jun 1st, 2004, 7:30am
Let's try changing the rules to get rid of the infinity confusion.
I'll write 1 and 2 on one piece of paper and 2 and 4 on another and put them in the envelopes. You pick an envelope, and I'll open it and tell you that there's a 2 on it. Now you can either keep the 2 or trade it for whichever number in the other envelope is not a 2.
Well, 1 and 4 have an expected return of 2.5, so you're better off switching. Knowing that, and knowing that my opening of the first envelope will not give you any additional information (since I'll always find a 2), even before I open the first envelope you're better off switching, because you don't want the 2 - you want the 1 or 4. But there would be no reason to switch back unless I change the rules and say you're going to get the 2 in this envelope instead, and it's the first envelope that has the variable payout.
The reason the riddle has an expected payout of 1.25 is because the uncertain envelope is worth more than the certain one. But in the original riddle, the certain one becomes certain only after you open it and see a fixed amount. There is no point in switching before you see that amount because, unlike my variation, what you find in one envelope will not change what's in the other; it only changes how much information you have about what's there. (And information changes odds even when it doesn't change reality)
I hope this makes some sense.
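
A tiny sketch of that 1-2 / 2-4 variation in Python (the trial count is arbitrary), confirming the 2.5 expectation for trading away the 2:

import random

TRIALS = 1_000_000
total = 0
for _ in range(TRIALS):
    papers = [(1, 2), (2, 4)]
    random.shuffle(papers)
    yours, other = papers                               # you always find a 2 on your paper
    total += other[0] if other[0] != 2 else other[1]    # the non-2 number on the other paper

print(total / TRIALS)   # ~2.5, versus a guaranteed 2 for keeping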

Title: Re: HARD: ENVELOPE GAMBLE
Post by asterix on Jun 1st, 2004, 7:49am
Of course, the other way to avoid having to find a random distribution over infinity is to say the numbers aren't random. I deliberately pick the values in each envelope. Then, you know I wouldn't put an odd number in either envelope because when you saw it you'd know the other envelope has to have 2x. Therefore, I'd never pick a number which divided by 2 would be odd, because if you picked that one you'd know the other envelope certainly doesn't contain an odd number, so it must have 2x. Therefore I'd never pick a number which divided by 4 would be odd...
So the only logical choice for me would be to put Infinity in one envelope and 2xInfinity in the other. And the payout is the same, so there's no reason to switch.

Title: Re: HARD: ENVELOPE GAMBLE
Post by BNC on Jun 1st, 2004, 7:54am

on 06/01/04 at 07:49:49, asterix wrote:
Of course, the other way to avoid having to find a random distribution over infinity is to say the numbers aren't random. I deliberately pick the values in each envelope. Then, you know I wouldn't put an odd number in either envelope because when you saw it you'd know the other envelope has to have 2x. Therefore, I'd never pick a number which divided by 2 would be odd, because if you picked that one you'd know the other envelope certainly doesn't contain an odd number, so it must have 2x. Therefore I'd never pick a number which divided by 4 would be odd...
So the only logical choice for me would be to put Infinity in one envelope and 2xInfinity in the other. And the payout is the same, so there's no reason to switch.


Yeah... and the exam could never happen  :P

Title: Re: HARD: ENVELOPE GAMBLE
Post by Grimbal on Jun 1st, 2004, 4:15pm
Let's be practical.

There are some reasons, whether valid or not, that tell us that we should switch.  There are some other reasons, valid or not, that tell us it doesn't matter.  So, just in case, we should switch.  Then get the money.

But in my opinion, there is no distribution such that, for every value X you might pick, P(X is the high amount | you picked X) = P(X is the low amount | you picked X), which is what the problem implies.

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Jun 2nd, 2004, 6:38am

on 06/01/04 at 16:15:30, Grimbal wrote:
Let's be practical.

There are some reasons, whether valid or not, that tell us that we should switch.  There are some other reasons, valid or not, that tell us it doesn't matter.  So, just in case, we should switch.  Then get the money.

Very practical. As long as no-one comes up with a reason, whether valid or not, to say we lose by switching...

As for the variation with 1-2 and 2-4, take the 2 or switch, it isn't obviously equivalent to the original problem because in the original, the amounts in the two envelopes are determined first, then you pick an envelope. In the variation, you pick an envelope, and that determines the value.

Obviously, for any finite set of possible values for X, where the envelopes contain values X and 2X, the expected return for always switching without seeing the contents of the envelopes (or equivalently with no knowledge of the set of values X can take) must be the same as that for always keeping your first pick. Looking at a uniform distribution for X over the first n powers of 2, (2, 4, ..., 2^n), the expected return for never switching is 3(2^n-1)/n, while the expected return from always switching after seeing the contents of the envelope, except when opening the maximum possible value 2^(n+1), is (7*2^n-6)/(2n) - a gain of 2^(n-1)/n. On the other hand, I'm sure Icarus would have harsh words for me if I then tried taking the limit as n goes to infinity :) - after all, a uniform distribution across an infinite number of possibilities has probability 0 of any given event, which means you're looking for some sort of integral rather than a summation, and the profit lies in the strange effects for the maximum value, which means the value of the integral is determined by a discontinuity, which gets even harder to cope with, and oh no, Three Hands has gone cross-eyed...
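For anyone who wants to check those figures, here is a small sketch in Java (the choice n = 10 and the class name are just illustrative): it enumerates the n equally likely pairs and compares never-switching with the switch-unless-you-open-the-maximum strategy against the two closed forms above.

Code:
public class FiniteUniformCheck {

      static public void main (String[] args)
      {
            int n = 10;   // arbitrary illustration
            double stay = 0, smartSwitch = 0;
            for (int k = 1; k <= n; k++)   // pair k holds 2^k and 2^(k+1), probability 1/n
            {
                  double low = Math.pow(2, k);
                  double high = 2 * low;
                  // each envelope of the pair is opened with probability 1/2
                  stay += (0.5 * low + 0.5 * high) / n;
                  // always switch, except when the opened envelope shows the overall maximum 2^(n+1)
                  double afterLow = high;                     // opened the low one: switch, get the high one
                  double afterHigh = (k == n) ? high : low;   // opened the high one: keep it only at the maximum
                  smartSwitch += (0.5 * afterLow + 0.5 * afterHigh) / n;
            }
            System.out.println("never switch              : " + stay);
            System.out.println("closed form 3(2^n-1)/n    : " + 3 * (Math.pow(2, n) - 1) / n);
            System.out.println("switch except at maximum  : " + smartSwitch);
            System.out.println("closed form (7*2^n-6)/(2n): " + (7 * Math.pow(2, n) - 6) / (2 * n));
      }
}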

Title: Re: HARD: ENVELOPE GAMBLE
Post by Three Hands on Jun 2nd, 2004, 10:38am

on 06/02/04 at 06:38:09, rmsgrey wrote:
after all, a uniform distribution across an infinite number of possibilities has probability 0 of any given event, which means you're looking for some sort of integral rather than a summation, and the profit lies in the strange effects for the maximum value, which means the value of the integral is determined by a discontinuity, which gets even harder to cope with, and oh no, Three Hands has gone cross-eyed...


Dammit - and done in such a way that I was struggling to keep them uncrossed to begin with  >:(

After all, I'm a philosopher, not a mathemawhatsit... ::)

Title: Re: HARD: ENVELOPE GAMBLE
Post by pedronunezmd on Jun 3rd, 2004, 9:42pm
First of all, I do not think this is a distribution problem. The fact of the matter is that, in the riddle, the gamemaster has already chosen the x and 2x that go into each envelope, and it does not state this was done randomly.

The problem with calculating the expected payoff using the rationale stated in the riddle is as follows:

The envelopes are already set when the game begins. If you choose an envelope and call its value X, then when you calculate the expected payoff of switching, you are calculating the average amount you would expect per game if: you are given these same 2 envelopes already chosen, you are handed the same envelope each time (the one you have now called X), and you switch to the other each time.

The problem is that you are now saying that there is a P(A) = chance you are playing the (x/2,x) game, and P(B) = chance you are playing the (x,2x) game, and that P(A) = P(B) = 50%.

Therefore, payoff = P(A)(x/2) + P(B)(2x) = 1.25x

But this is not correct. P(A) does not equal P(B); in fact, since the envelopes are already set, either P(A) = 1 or P(B) = 1, you just don't know which. Just because you don't know does not mean you can call them 50% each. You are already playing either the A game or the B game, you just don't know which it is.

So actually the payoff = either x/2 or 2x, but you don't know which, when choosing to do the calculations by calling your envelope X.

Please consider this analogy: I have a coin, I flip it, then hide the results from you. Now I offer the following: if it is heads, you win $2, if it is tails, you win nothing. If the orientation of the coin never changes, the expected payoff is either $0 each time you play, or $2 each time you play. If I charged you $1 to play the game forever, you would either win an infinite amount of money, or lose an infinite amount of money. However, this is different from the game where, each time you play, I flip the coin anew, in which case the expected payoff is $1, and if you play an infinite number of times, you would in the long run break even having paid $1 per game.

Title: Re: HARD: ENVELOPE GAMBLE
Post by pedronunezmd on Jun 3rd, 2004, 9:51pm
I must also note that what the riddle is asking you to do is calculate the expected payoff of this scenario.

Suppose the gamemaster has a set of an infinite number of envelopes lying on the table, and for any envelope x, he always remembers which other envelope is the corresponding "2x" envelope. He always picks 2 envelopes such that one contains twice the other.

On the first round, he hands you 2 envelopes. Before opening them, you label one envelope as "X" literally, with a marker or pen and give both envelopes back. You then agree to play the game an infinite number of times.

Each time he hands you envelopes and neither one is marked with your X, you reject playing the game and ask for a new pair. He is picking the pairs of envelopes at random. For now, assume he can do this.

However, when you get 2 envelopes, and one has an "X" on it, you agree to play the game, and now you always switch to the other envelope!

In this version of the game, the chance that you are now playing the (x/2,x) versus the (x,2x) game is equal and set to 50%, and the expected payoff is as calculated in the riddle.

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Jun 4th, 2004, 3:37am

on 06/03/04 at 21:42:00, pedronunezmd wrote:
But this is not correct. P(A) does not equal P(B)
Depends on how you choose which envelope you pick; if you label the envelopes A and B, then flip a coin and choose envelope A if it's heads and B if it's tails, then P(A) = P(B).
Choosing the 2x envelope is equally likely as choosing the x envelope.

Title: Re: HARD: ENVELOPE GAMBLE
Post by Grimbal on Jun 4th, 2004, 8:23am

on 06/03/04 at 21:42:00, pedronunezmd wrote:
First of all, I do not think this is a distribution problem. The fact of the matter is that, in the riddle, the gamemaster has already chosen the x and 2x that go into each envelope, and it does not state this was done randomly.


But you don't know the numbers.  So you have to consider the game master as a random process.  If you don't, you cannot analyze the problem probabilistically.

Suppose the game master gives you one envelope with some amount in it. What is the probability that it is over $1?

Title: Re: HARD: ENVELOPE GAMBLE
Post by pedronunezmd on Jun 4th, 2004, 10:03am

Quote:
Depends on how you chose which envelope you pick, if you label the envolopes A and B, then flip a coin and choise envelope A if it's heads and B if it's tails, then P(A) = P(B).
Choosing the 2x envelope is equally likely as choosing the x envelope.

Actually, I defined "A" as "you are playing the (x/2,x) game" and "B" as "you are playing the (x,2x) game", not as labels for each of the two envelopes. Whichever envelope you choose, you are stating the amount inside it is "x" per the riddles suggestion. You know that if the envelope you have has contains x dollars, the other envelope has either x/2 dollars, if it turns out you are playing the A game, and 2x dollars, if it turns out you are playing the B game. But the P(you are playing the "A" game) = P(A) is either 1 or 0, because the envelopes have already been set. It is incorrect, given the way envelopes' amounts have already been set, to state that P(A) = .5

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Jun 4th, 2004, 2:14pm
But which game you're playing isn't determined in advance - you determine which game you're playing by your choice of envelope. The contents of the envelopes are set as, say, M and N with N=2M, but which you label as X depends on which you pick (with even chance each way) so you have an equal chance of being in the X,2X game or the X,X/2 game. The catch is that X in the first case is M, and in the second N, so the latter is worth twice the former.

Title: Re: HARD: ENVELOPE GAMBLE
Post by pedronunezmd on Jun 4th, 2004, 4:49pm

Quote:
But which game you're playing isn't determined in advance - you determine which game you're playing by your choice of envelope.

I'm not saying that which game you're playing is determined in advance. However, if you choose to determine the expected payoff of the game, once you have the 2 envelopes in front of you, by calling one "x", then "which game you are playing" is now fixed, as far as the calculation of your expected payoff is concerned. You have specified and labeled the one envelope as "x", and you have either picked the x such that you are playing the A game or the B game, for purposes of calculating the expected payoff. One of the two games is now "the actual game" and the other is irrelevant. You don't know which, but you can't now assume that P(A) = P(B) at this point in the calculations, which is why the reasoning stated in the riddle is wrong.

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Jun 5th, 2004, 7:14am
But I have no information about whether I'm playing game A or game B. Assuming that my choice of which envelope to label 'X' is independent of which has more money, then if I played the game a million times, about 500,000 of them would be game A and the remainder game B.

At the point where you face the choice of switching or not, then you are in one of two universes, one where P(A)=1, and one where P(B)=1, but you have no way of knowing which universe you are in, so, from the information you have, P(P(A)=1,P(B)=0) = P(P(A)=0,P(B)=1) = 0.5, in other words, P(A)=P(B)=0.5


Quote:
Please consider this analogy: I have a coin, I flip it, then hide the results from you. Now I offer the following: if it is heads, you win $2, if it is tails, you win nothing. If the orientation of the coin never changes, the expected payoff is either $0 each time you play, or $2 each time you play. If I charged you $1 to play the game forever, you would either win an infinite amount of money, or lose an infinite amount of money. However, this is different from the game where, each time you play, I flip the coin anew, in which case the expected payoff is $1, and if you play an infinite number of times, you would in the long run break even having paid $1 per game.


While the range of possible outcomes is different between the two coin toss games, in both of them, the expected return before the coin is revealed is 0. After the first round, the expected return per round for the first game is fixed at one of +1 or -1, but until you know which, the two are equally likely to be the case, so the expected return is still 0.

Title: Re: HARD: ENVELOPE GAMBLE
Post by asterix on Jun 5th, 2004, 11:40am
Suppose the two numbers in the envelope were 2 and 4, and you played the game a million times, but with short term memory loss, so you never figured out that the numbers never changed. Half the time you'd draw the 2, figure the other envelope must be 1 or 4, so you'd switch, and always win. The rest of the time you'd draw the 4, figure the other contains either 2 or 8, so you'd switch, and always lose. Final total is simply the average of 2 and 4, and there is no advantage whatsoever to switching.
Now go back to the original game where the numbers are random from an infinite distribution, and play the game an infinite number of times. There will be an infinite number of games where there is a 2 and a 4. And you'll draw each an equal number of times, and have the same results as I described, a simple average with no advantage to switching. There will also be an infinite number of times when you'll get every other possible pair, and in every case there will be no advantage to switching (unless you get a number that cannot be divided by 2 and you can determine logically that it's the low number).
So instead of looking at a 4 and saying, the other number is either a 2 or an 8; the average is 5, so I should switch. You should look at that 4 and say, "half the time this 4 is part of a 2,4 pair, and I'll always switch, so I'll average 3 and gain no advantage; the rest of the time this 4 is part of a 4,8 pair, and I'll always switch so I'll average 6 and gain no advantage."
In the long run there's no advantage to switching. But in the short run (playing the game just once) the average is not possible; you either gain or lose. And since this short run does not allow you to take into consideration the times when you draw 1/2 the actual number and win a little by switching or the times when you draw 2x the actual number and lose a lot by switching, a single game is going to give the impression of gaining by the switch (and I'd still switch).

Title: Re: HARD: ENVELOPE GAMBLE
Post by pedronunezmd on Jun 6th, 2004, 7:38am
In response to Rmsgrey's recent post:

Let me address the coin flip game first. There is a fundamental difference between calculating the expected outcome before the coin has been flipped and after the coin has been flipped.

Prior to the coin being flipped, P(heads) = P(tails) = .5 and you can use that in calculating the expected payoff. If you play the game one million times, starting before the coin flip each time, then approx 50% of the time you win with heads, and the other 50% of the time you lose with tails.

After the coin is flipped, then either P(heads) = 1 or P(tails) = 1, you just don't know which. If you play the game one million times, once the coin has been flipped, then either you are going to win one million times in a row, or you are going to lose one million times in a row. The expected outcome of the game is more accurately stated as "either you win $1 each time or you lose $1 each time", not "you break even".

Second, to address the first part of your post:

Quote:
But I have no information about whether I'm playing game A or game B. Assuming that my choice of which envelope to label 'X' is independent of which has more money, then if I played the game a million times, about 500,000 of them would be game A and the remainder game B.

When you choose to calculate your expected return by labeling one envelope as "x", you are calculating the average return if, given the same 2 envelopes, you always pick out the envelope that you labelled "x". If you played the game 1 million times, then 1 million times you are getting the envelope labeled "x".

Title: Re: HARD: ENVELOPE GAMBLE
Post by pedronunezmd on Jun 6th, 2004, 7:49am
In response to Asterix's recent post:

What you have posted, at least in the first 80% of your post, is another way to restate the solution first proposed by S.Owen in the very first post of this thread, which is the "correct" way to figure out the expected payoff of the game.

I hope that everyone agrees that the expected gain by switching is zero, as stated by S.Owen. If not, my offer still stands: I challenge anyone who disagrees to recreate the game as a computer program.

The challenge of this riddle, however, is not to state what the "correct" way to calculate the gain from switching is, but to show why the method proposed in the riddle is "not correct". For example, if I showed you a proof that 0=1, then it is not complete to just say "wrong! I know that 0 does not equal 1", but it would be complete to say "here is the flaw in your proof that 0=1".

This is what I'm trying to do. The reasoning in the riddle is wrong. To classify this riddle as "solved", we will need to show why the reasoning is wrong.

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Jun 6th, 2004, 11:40pm

on 06/06/04 at 07:38:08, pedronunezmd wrote:
The expected outcome of the game is more accurately stated as "either you win $1 each time or you lose $1 each time", not "you break even".
This is not true, because 'expected outcome' is one single value, not a statement. It's the sum of P(X=a)*a over all possible values a.
If P(X=x)=1, then the expected outcome is x; if P(X=y)=1, then the expected outcome is y. If X is either x or y, then you have to look at the chance of either state being the case, and here P(X=x|x or y) = 1/2 and P(X=y|x or y) = 1/2 (as said before), and therefore the expected outcome is (x+y)/2


Quote:
This is what I'm trying to do. The reasoning in the riddle is wrong. To classify this riddle as "solved", we will need to show why the reasoning is wrong.
It has already been shown several times why it is wrong, and what mistake is made.
The X in (X/2, X) isn't the same X as the X in (X, 2X), because both pairs are supposed to describe the same fixed pair of envelopes (it's the same envelope you're holding in each case).

Title: Re: HARD: ENVELOPE GAMBLE
Post by carpao on Jun 7th, 2004, 9:13am

on 06/06/04 at 07:49:48, pedronunezmd wrote:
If not, my offer still stands to challenge anyone who disagrees with recreating the game as a computer program.


If someone is interested ... a very simple Java program...


Code:
public class gambler {

      static public void main (String[] args)
      {
            int i;
            int[] j;
            int x;

            int switchTotal = 0;
            int noSwitchTotal = 0;
            int c0 = 0;
            int c1 = 0;

            j = new int[2];

            for (i = 0; i < 10000; i++)
            {
                  // put a random amount (between 0 and 9999) in one envelope and twice that in the other
                  j[0] = (int) (StrictMath.random() * 10000);
                  j[1] = j[0] * 2;

                  // choose one envelope at random (0 or 1 with equal probability)
                  x = (int) (StrictMath.random() + 0.5);

                  if (x == 0)
                        c0++;
                  else
                        c1++;

                  // what about switching?
                  switchTotal += j[(x + 1) % 2];

                  // what about not switching?
                  noSwitchTotal += j[x];
            }

            System.out.println("NoSwitch:" + noSwitchTotal);
            System.out.println("Switch:" + switchTotal);
            System.out.println("X = 0:" + c0);
            System.out.println("X = 1:" + c1);
      }
}

Title: Re: HARD: ENVELOPE GAMBLE
Post by Leon on Jul 23rd, 2004, 11:21am
It doesn't differ much from this thread, but in case anyone is interested:

http://www.maa.org/devlin/devlin_0708_04.html

Title: Re: HARD: ENVELOPE GAMBLE
Post by JocK on Jul 23rd, 2004, 4:20pm
Before opening the envelope, the expectation value for the payout is infinite (it has to be, or else P(X=high|picked=X) = P(X=low|picked=X) cannot be true for all X). So, the other envelope contains twice infinity or half infinity. Swapping brings no advantage because infinity is infinity, regardless of whether you multiply by 2 or by 1/2.

The paradox disappears when using an a-priori distribution with a finite expectation value. (E.g. n dollars is put in the first envelope with probability (1/2)^n, (n=1, 2, ...), and double the amount in the first envelope is put in the second envelope.)
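To see how such a prior kills the paradox, here is a sketch in Java using exactly that distribution (the truncation at n = 30, the printout cutoff, and the class name are my own arbitrary choices): it computes the expected gain from switching conditional on the amount you find, and the overall expected gain from always switching. The conditional gain is positive for odd amounts but negative for larger even amounts, and the weighted total comes out to essentially 0, so there is no longer a gain for every observed value.

Code:
public class FiniteExpectationPrior {

      static public void main (String[] args)
      {
            int maxN = 30;   // arbitrary truncation; (1/2)^n is negligible beyond this
            double totalGain = 0;
            for (int m = 1; m <= 2 * maxN; m++)
            {
                  // chance of seeing m dollars in the envelope you picked (each envelope picked with prob 1/2):
                  // either m is the amount in the first envelope...
                  double pSeeAsFirst = (m <= maxN) ? 0.5 * Math.pow(0.5, m) : 0;
                  // ...or m is the doubled amount in the second envelope (only possible for even m)
                  double pSeeAsSecond = (m % 2 == 0) ? 0.5 * Math.pow(0.5, m / 2) : 0;
                  double pSee = pSeeAsFirst + pSeeAsSecond;
                  if (pSee == 0) continue;
                  // switching gains +m if you hold the first envelope, and loses m/2 if you hold the second
                  double condGain = (pSeeAsFirst * m - pSeeAsSecond * (m / 2.0)) / pSee;
                  totalGain += pSee * condGain;
                  if (m <= 8)
                        System.out.println("see " + m + ": expected gain from switching = " + condGain);
            }
            System.out.println("overall expected gain from always switching = " + totalGain);
      }
}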

J8)CK

Title: Re: HARD: ENVELOPE GAMBLE
Post by JocK on Jul 23rd, 2004, 4:45pm

on 07/23/04 at 16:20:47, JocK wrote:
The paradox disappears when using an a-priori distribution with a finite expectation value.

This brings me to a variant of the two-envelope paradox that nicely stresses that switching paradoxes only occur when the expected pay-out for both envelopes is infinite:

The St. Petersburg Two-Envelope Paradox

You are presented with two envelopes. You are told that each of them contains an amount determined by the following procedure, performed separately for each envelope: starting with $1, a coin was flipped until it came up heads. Each time a tail showed, the amount was doubled. So, if the first n trials showed tails, but a head came up in the (n+1)th trial, an amount of 2^n was put into the envelope. This procedure is performed separately for each envelope. You randomly select an envelope, and are offered the options of keeping this envelope or switching to the other. What should you do?

You might reason as follows:
- Before opening the envelopes, the expected value in each is infinite.
- For any x, if you knew that your envelope contained x, then the expected value in the other would still be infinite.
- So for all x, if you knew that yours contained x, you would have an expected gain in switching to the other envelope.
- So you should switch. But this seems clearly wrong, as your information about both envelopes is symmetrical and following this line of thought you should keep switching between both envelopes.
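Here is a sketch of that filling procedure in Java (the trial count is arbitrary): since the two envelopes are filled independently by the same procedure, always switching and never switching produce the same distribution of outcomes, though with an infinite expectation the sample averages keep jumping around no matter how many trials you run.

Code:
import java.util.Random;

public class StPetersburgEnvelopes {

      static Random rng = new Random();

      // fill one envelope: start at $1 and double for every tail until the first head
      static long fill()
      {
            long amount = 1;
            while (rng.nextBoolean())
                  amount *= 2;
            return amount;
      }

      static public void main (String[] args)
      {
            int trials = 1000000;   // arbitrary
            long stay = 0, swap = 0;
            for (int t = 0; t < trials; t++)
            {
                  long a = fill();
                  long b = fill();          // the two envelopes are filled independently
                  boolean pickA = rng.nextBoolean();
                  stay += pickA ? a : b;    // keep the first pick
                  swap += pickA ? b : a;    // always switch
            }
            System.out.println("average if you never switch : " + (double) stay / trials);
            System.out.println("average if you always switch: " + (double) swap / trials);
      }
}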

J8)CK

Title: Re: HARD: ENVELOPE GAMBLE
Post by mattian on Jul 26th, 2004, 11:29am
There is no strategic benefit to switching - there is a 50% chance of receiving the higher of the two sums of money - regardless of your decision.

Proof:

Show me an argument that illustrates the benefit in switching, and I'll show you the same argument that illustrates the benefit in not switching.

For example:  Suppose by switching, you stand to gain double the amount that's in your hand, with the risk of losing half of what's in your hand.  Clearly this must be a benefit because the prospective gain is higher than the potential loss.  Right?

Wrong.  If we reword it and say that after selecting an envelope, you have the option of keeping the envelope that you have or defaulting to the other envelope, we could make the same argument:  By keeping the envelope we have, we stand to walk away with twice the amount that's in the other envelope, and we stand to miss out on half the amount that's in the other envelope.  This time it is clearly beneficial to keep what we have.

Since these two arguments - though contradictory - are balanced, the odds of either choice HAVE TO BE 50%, and therefore there must be a flaw in any analysis which shows a bias one way or the other.

Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 5th, 2004, 3:42pm
Obviously, I agree with S. Owen's first post.

I have 3 points about why there is an apparent paradox.

I think my points are so similar to what has already been said, that I would not bother were it not for the fact that some posters are saying that the paradox has not been properly explained.  So I will give it a go.


Point 1

Let's say the 2 envelopes are A and B, and A is the one you hold.

If you wish to analyse the problem by defining that one of the envelopes contains x, then you have the following possibilities.

1.  A=x, B=0.5x.
2.  A=x, B=2x.
3.  A=0.5x, B=x
4.  A=2x, B=x

So there are 4 possibilities, all equally likely.

If you swap, your gain, G, is

G = 0.25(-0.5x) + 0.25(x) + 0.25(0.5x) + 0.25(-x) = 0

The probability equation in the riddle ignores possibilities 3 and 4.


Point 2

If it were true (which it isn't) that swapping produced an increase in your chances of winning, then it would be true that swapping back would also increase your chances, etc.

That the original riddle has its definitions wrong is easily shown (and has been already in this thread).

Following the riddle's analysis you could say to yourself, I have Y in this envelope, the other envelope has 50% chance of having 0.5Y and 50% of having 2Y and so I will swap.

But if you remembered your last argument, pre-swap, you would think the other envelope has X.  My envelope therefore has 50% chance of having 0.5X and 50% of having 2X and so I will not swap.

We do not know what X and Y are and so the analysis is meaningless.

(I believe, Mattian, this is the point you have just made).


Point 3

It will always be worth swapping after opening the envelope (if we ignore real-world considerations) because your potential gain is twice your potential loss.

The reason point 1 no longer applies is that now we can ascribe a particular value to x.

Say the envelope contains $100.

We can have x=$100, and use possibilities 1 and 2 from above and ignore 3 and 4.

The reason point 2 no longer applies is obvious.  Once we have swapped, that is the end of the game.  We know for certain if we have got $50 or $200.  The question of whether it is better to swap again does not arise.



Title: Re: HARD: ENVELOPE GAMBLE
Post by BNC on Sep 5th, 2004, 5:54pm

on 09/05/04 at 15:42:53, Padzok wrote:
...
Point 3

It will always be worth swapping after opening the envelope (if we ignore real word considerations) because your potential gain is twice your potential loss.

The reason point 1 no longer applies is that now we can ascribe a particular value to x.

Say the envelope contains $100.

We can have x=$100, and use possibilities 1 and 2 from above and ignore 3 and 4.

The reaon point 2 no longer applies is obvious.  Once we have swapped, that is the end of the game.  We know for certain if we have got $50 or $200.  The question of whether it is better to swap again does not arise.


But don't you see the problem? Opening the envelope did not give any "important" (or relevant) information. So you found $100... your decision would not have changed if it were $50, $200, or any other value. The sum of money in the envelope does not change the outcome, and thus (IMHO) does not matter.

In other words, instead of opening the envelope, why not guess a sum (say $100) -- as it wouldn't matter anyway -- and go on from there?

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Sep 6th, 2004, 7:31am

on 09/05/04 at 17:54:08, BNC wrote:
Opening the envelope did not give any "important" (or relevant) information.

It depends what information you consider "important". To borrow language from Quantum Mechanics, before opening either envelope, both exist in a (symmetric) superposition of states. Once you open one envelope, you collapse the waveforms of both envelopes: the opened one to a single, definite value; the other to a superposition of two values.

If you prefer, you can look at it this way. Before opening the envelope, you have two absolutely symmetrical envelopes. By opening the envelope, you destroy the symmetry - one envelope is open, and the other closed. The open envelope has an absolutely known quantity of money inside, while the other has a chance of being worth more (dependent upon the distribution of possible values for the envelope contents).

As (yet) another way of looking at it: if you consider finite uniform distributions, you get a maximum possible value, for which you won't switch, but whatever else you open, you will switch. As the distribution gets wider, the chance of opening the value that causes you not to switch heads to 0, so the chance of not switching also heads that way, but, for any finite range of values, there's still the possibility of opening the counter-example.

Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 6th, 2004, 12:17pm
on Sep 5th, 2004, 5:54pm, BNC wrote:
Opening the envelope did not give any "important" (or relevant) information.  

It gives you 2 pieces of information.

1.  That one of the envelopes contains $100
2.  That the envelope you hold contains $100

If all you knew was that one of the envelopes contained $100, then the possibilities would be:

You have $100, other is $50
You have $100, other is $200
You have $50, other is $100
You have $200, other is $100

You are right, BNC, to say that there is nothing special about $100.  You could write out the 4 possibilities for any positive integer.

The general case is as I had in my last post,

1.  A=x, B=0.5x.
2.  A=x, B=2x.
3.  A=0.5x, B=x
4.  A=2x, B=x

where A is the envelope you hold.

Alternatively you can swap and then hold envelope B (without opening) and the possibilities remain unchanged.  

The formulation given for working out the probabilities in the original post ignores the symmetry, and the problems which arise from that are as described in my last post (and by mattian in the post before that).

As rmsgrey has said, opening one envelope destroys the symmetry.  


What is an acceptable solution to the riddle, bearing in mind that the riddle apparently asks you to show the flaw in the logic of the original argument?

I think several different explanations have been offered.  There is no reason that more than one might not be correct.  

Has anybody shown that ALL the arguments  explaining the flaw in the riddle's logic are wrong?  If not, then has the apparent paradox been satisfactorily explained?




Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 7th, 2004, 3:09am

on 09/05/04 at 15:42:53, Padzok wrote:
Point 3

It will always be worth swapping after opening the envelope (if we ignore real word considerations) because your potential gain is twice your potential loss.

Let's say one of the envelopes contains $50, and the other $100. And let's say you don't know that, but only know that one envelope contains twice as much as the other.

If you pick the envelope with $50 there is no actual chance the other contains $25, nor if you pick the one with $100 is there any actual chance the other contains $200.
Your potential gain and your potential loss are both $50. So opening the envelope doesn't make switching a good option all of a sudden. It's just as irrelevant as before.

Title: Re: HARD: ENVELOPE GAMBLE
Post by mattian on Sep 7th, 2004, 6:16am
I still think the simplest explanation is given by demonstrating the converse:

By rewording it, we can say:  Having chosen an envelope you are then given the opportunity to keep the envelope or opt for the contents of the other envelope.  Is there any benefit to keeping the envelope?

Using the same 'reasoning' employed by those who believe in switching, I could argue that there is a benefit to keeping the envelope.

Rewording the puzzle shouldn't affect the mathematics of the puzzle - and it doesn't - the solution is simply that there is no benefit nor disadvantage to switching.

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Sep 7th, 2004, 7:24am
Towr:

If we're discussing alternative versions of the riddle, how about the following:

You go on a gameshow where half the time the prizes are $50 and $100, and the other half, they're $100 and $200. You pick one envelope, and then the host, who knows which envelope contains $100 always opens the envelope containing $100. You then get a chance to change your mind. Should you switch a) when your envelope contains $100, b) when the other envelope contains $100?

I think part of the problem with the "opened" question is that the original, "unopened", question (correctly) regarded the distribution of possible values as irrelevant. To analyse the "opened" question fully, you'd need either to assume a given distribution and see what the correct choice is then, or make assumptions about the range of possible distributions, and how they are distributed (a meta-distribution problem) and solve the (much harder) question for that, or abstract even further into meta-n-distributions...

Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 7th, 2004, 12:56pm

on 09/07/04 at 03:09:42, towr wrote:
Let's say one of the envelopes contains $50 dollar, and the other a $100. And let's say you don't know that, but only know that one envelope contains twice as much as the other.

If you pick the envelope with $50 there is no actual chance the other contains $25 , nor if you pick the one with $100 is there any actual chance the other contains $200.
Your potential gain, and your potential loss are both $50 dollar. So opening the envelope doesn't make switching a good option all of a sudden. It's just as irrelevant as before.


Well, if you open your own envelope and it contains $100 then, as far as you know, your maximum gain from swapping is $100 and your maximum loss is $50.  You have a 50% chance of either outcome (again as far as you know), and so swapping is worthwhile.

On the other hand, if the other envelope is opened and contains $100 then now your maximum loss is $100 and maximum gain is $50 (as far as you know) and so swapping is NOT worthwhile.

Let me restate the original conundrum as follows.

1.  If you have 2 unopened envelopes, one containing twice the cash of the other, and you hold one, is the sensible play to swap envelopes?  (Y = swap, it increases my chances of winning more cash;  N = don't swap, it reduces my chances of winning the most cash;  D = doesn't matter, don't care, why bother?).

2.  If you have the same set up, but one envelope is opened, is the sensible play to swap envelopes? (Y, N, or D).

3.  If you have answered the same way to both of the above (2 Ys, or 2 Ns, or 2 Ds) then perhaps your work here is done.  But feel free to say how you think the 2 scenarios inter-relate.  On the other hand, if you have 2 different answers, then perhaps you should say why the answers are different.

My answers are:

1. D
2. Y if your own envelope has been opened; N if it is your opponent's.



1.  EXPLANATION

You can say the envelopes contain x and 2x if you wish.

You have a 50% chance of holding either one.  

Swapping will gain or lose x.

Therefore 50% of the time you will lose x by swapping and 50% of the time you will gain x.

Overall there is no gain or loss.

Alternatively

You can just as easily say the envelopes contain x and 0.5x.

The above analysis is repeated exactly with you gaining 0.5x 50% of the time by swapping, and losing 0.5x 50% of the time.  So the overall gain is zero.

Alternatively

You can say that you know one of the envelopes contains x and you do not know what the other has.  

The other envelope either contains 0.5x or it contains 2x, and you do not know which.  50% of the time the other envelope will have 0.5x, and 50% of the time it will have 2x.  But also 50% of the time you will have the envelope with x and 50% of the time you will have the other one.

So
a)  25% of time, you have x, other has 2x.  (Swap gain x)
b)  25% of time, you have x, other has 0.5x. (Swap lose 0.5x)
c)  25% of time, you have 2x, other has x. (Swap lose x)
d)  25% of time, you have 0.5x, other has x. (Swap gain 0.5x)

Overall, no benefit in swapping.

Alternatively

You can be quite basic and non-mathematical.

Pre-swap, both I and my opponent have an envelope whose contents are unknown.  After any swap, both I and my opponent have an envelope whose contents are unknown.

There is therefore no observable difference between the scenarios and swapping has made no difference.

It is to be noticed that all the above mathematical explanations (and the non-mathematical one) involve no assumptions about which envelope you hold.  Therefore, if you did choose to swap, the analysis could be repeated without change in the new scenario.


2.  EXPLANATION

Let's say your envelope is opened.  Let's say it contains x.

There is now a 50% chance the other one contains 2x, for a gain of x by swapping.

There is a 50% chance the other one contains 0.5x, for a loss of 0.5x by swapping.

If you start off with x every time and play a million times, then clearly you will have gained by swapping, compared to not swapping.

This game is actually a different one to the original.

Here imagine the gameshow host starts off with 3 envelopes each time.

The 3 envelopes contain 0.5x, x, 2x.

Every time the game is played, he gives you the envelope with x.

He then picks one of the remaining 2 envelopes completely at random and gives it to your opponent.  

So long as you know that you will be given the same amount each time (x= any finite positive constant) then actually finding out what x is by opening the envelope is irrelevant.  

If you play the game a million times then swapping makes you a profit (well, OK, it almost certainly makes you a profit) for any value of x.

Hopefully, this analysis demonstrates that actually knowing what "x" is is unimportant.  What is important is that it is fixed.  Opening the envelope is what fixes it (in the scenario where the original game is played once).  That is the only significance of opening the envelope (in the original game).  The actual amount (ie whether it is $100, or $200, or $1,000,000,000) is not important in itself.
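A sketch of this host-with-three-envelopes variation in Java (the fixed amount x = $100, the trial count, and the class name are arbitrary choices): because the host always hands you x, and the opponent's envelope really is x/2 or 2x with equal probability, always switching averages about 1.25x while keeping averages x.

Code:
import java.util.Random;

public class HostedVariation {

      static public void main (String[] args)
      {
            Random rng = new Random();
            double x = 100;         // arbitrary: the fixed amount the host always hands you
            int trials = 1000000;   // arbitrary
            double keepTotal = 0, switchTotal = 0;
            for (int t = 0; t < trials; t++)
            {
                  // the host gives your opponent one of the remaining envelopes (x/2 or 2x) at random
                  double other = rng.nextBoolean() ? x / 2 : 2 * x;
                  keepTotal += x;         // never switch: you always walk away with x
                  switchTotal += other;   // always switch: you take the opponent's envelope
            }
            System.out.println("keep  : " + keepTotal / trials);     // about 1.00 * x
            System.out.println("switch: " + switchTotal / trials);   // about 1.25 * x
      }
}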


The above analysis is not symmetrical.  Therefore if the other envelope (your opponent's) is open it does matter.   I do not think you should swap in that case, because the reverse reasoning applies.


At the top of the post, I applied real values.  Here they are again.

If you know you already have $100, and you swap 100 times:
You gain 50 x $100 = $5000 (on average).
You lose 50 x $50 = $2500 (on average).
You gain (on average) $2500 by playing 100 times.

You know this as soon as your envelope is opened and reveals $100 inside.  You do not know beforehand.  So I do not see how the situation with the open envelope can be described as no different to 2 unopened envelopes.


If you know your opponent already has $100, and you swap 100 times:
You lose 50 x $100 = $5000.
You gain 50 x $50 = $2500.
You lose (on average) $2500 by playing 100 times.

Again, you know this after the opponent's envelope is opened, but not before.



3.  JUSTIFICATION OF DIFFERENCE

For me the hard part about trying to justify why there is a difference is that it is self-evident that the potential for difference arises as soon as more info is provided to you.  

To say that your having the information should make no difference to your choice merely because the info existed (in an abstract sense, or else was known to a hypothetical third party) before you had it in your possession seems to ignore our everyday experiences.

The OPEN envelope scenario gives you more information than the CLOSED envelope scenario.  Why is it surprising that your best play changes according to the available info?

To give a non-mathematical example, say I ask you if you're going to bet on horse A in the next race.

You say, "Sure I am.  His times are vastly superior to the other horses.  Taking into account the odds, I think I will gamble."

But then I tell you:

a) the odds have changed; he is now short-priced favourite, or
b) his new secret times have come into my possession, he is slower than you thought, or
c) 5 new horses have been added to the field, or
d) he was in fine fettle yesterday, but has a fever this morning.

In any of the above, you have more info and so your decision may change.

Before I gave you the information (d, say) it was still true that the horse had a fever this morning.  The only thing which has changed is that you now have the info in your possession.  Once you have the info, your best move as you see it changes.


Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 7th, 2004, 1:25pm

on 09/07/04 at 12:56:21, Padzok wrote:
Well, if you open your own envelope and it contains $100 then, as far as you know, your maximum gain from swapping is $100 and your maximum loss is $50.
No, what I know is that I either gain as much as is in the envelope with the least money, or I lose as much as is in the envelope with the least money. And that remains the same, since there are only those two envelopes.

Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 7th, 2004, 1:39pm

on 09/07/04 at 13:25:36, towr wrote:
No, what I know is that I either gain as much as is in the envelope with the least money, or I loose as much as in the envelope with the least money. And that remains the same, since there are only those two envelopes.


Towr, I must be honest and say I do not understand what you mean.

There is a 50% chance that the envelope with the least money contains $100.

So you say there is a 50% chance you either gain or lose $100?

Why?

What about the other 50% of the time?

Do you mean that in that case the least amount of money is $50 and so you either gain or lose $50?

Why?

(I'm sure you won't equate brevity with rudeness... it's not intended that way)

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 7th, 2004, 2:53pm
There are only two envelopes on the table; you can't switch to a third one since it doesn't exist. One of the two envelopes has the least amount, $L, in it. If you happen to pick it and switch, you gain $L; if you happened to pick the other one and switch, you lose $L.
If the two envelopes on the table contain $50 and $100, then L=50, regardless of whether you've opened either envelope. And while you might not know it is 50, I do know it doesn't change.

Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 7th, 2004, 3:35pm

on 09/07/04 at 14:53:35, towr wrote:
There are only two envelopes on the table, you can't switch to a third one since it doesn't exist. One of the two envelopes has the least amount $L in it. If you happen to pick it, and switch you gain $L, if you happened to pick the other one, and switch you loose $L.
If the two envelopes on the table contain $50 and $100, then L=50, regardless of whether you've opened either envelope. And while you might not know it is 50, I do know it doesn't change.


But what are you saying?

Are you saying I am playing the game, and you know (but I do not) that there is $50 and $100 in the envelopes?

In which case I agree that you will know the following:

1.  When I swap while the envelopes are unopened, you know I am either notionally losing or notionally gaining $50 each time.

Presumably you agree that you do not know which?

Presumably you agree that any gains or losses remain notional as long as the envelopes remain closed?

I am asserting that in this case there is no benefit to me in swapping the envelopes once or twice or three times or 99 times or a hundred times.  Do you agree with that?


2.  When I open one of the envelopes, you will see it either has $50 in it or $100.

From your point of view the problem is now completely solved.  You have all the information anybody could want.

You know that if I have the $50 envelope and I choose to swap, I will gain $50.  You know that before I make any decision, but you only know after I open the envelope.

You have set the conditions so that if I play the game 100 times, then, on average:

I will get $50 envelope 50 times and swap gaining 50 x $50=$2500
I will get $100 envelope 50 times and swap losing 50 x $50=$2500
for an overall average of nil.

By treating both envelopes as being completely determined as soon as I open one of them, you have invented a new scenario.

Do you agree?

Because if you are only thinking about the scenario where the envelopes remain closed, then I think we agree swapping means nothing.




Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 8th, 2004, 12:52am

on 09/07/04 at 15:35:33, Padzok wrote:
But what are you saying?

Are you saying I am playing the game, and you know (but I do not) that there is $50 and $100 in the envelopes?
Well, the actual amount isn't all that important, just that there is a predetermined amount in both envelopes, and the content of the one is worth twice as much as the other.
I thought it might be easier to talk about it if the amounts were specified.



Quote:
In which case I agree that you will know the following:

1.  When I change when the envelopes are unopened, you know I am either notionally losing or notionally gaining $50 each time.

Presumably you agree that you do not know which?
If I can't distinguish the envelopes or don't know which you choose, then no, I don't know what is the actual case.


Quote:
Presumably you agree that any gains or losses remain notional as long as the envelopes remain closed?
Well, as long as no choices have been made. If a choice has been made, everything is actual, even if I don't know what the reality is. But it's hardly the point. After giving it some more thought last night I think I can get to it later on in this post.


Quote:
I am asserting that in this case there is no benefit to me in swapping the envelopes once or twice or three times or 99 times or a hundred times.  Do you agree with that?
Yes.


Quote:
2.  When I open one of the envelopes, you will see it either has $50 in it or $100.

From your point of view the problem is now completely solved.  You have all the information anybody could want.

You know that if I have the $50 envelope and I choose to swap, I will gain $50.  You know that before I make any decision, but you only know after I open the envelope.
Yes, but of course God knew before that ;) The actual overviewer isn't important, nor if there is one, just as long as the universe doesn't change unpredictably.


Quote:
You have set the conditions so that if I play the game 100 times, then, on average:

I will get $50 envelope 50 times and swap gaining 50 x $50=$2500
I will get $100 envelope 50 times and swap losing 50 x $50=$2500
for an overall average of nil.

By setting both envelopes, as being completely determined as soon as I open one of them, you have invented a new scenario.
No, they are determined before they are opened. Someone has taken two envelopes, put money in them, and laid them on the table. From that moment the amounts in them are determined and don't change, neither before nor after you open one of them.


Quote:
Do you agree?
obviously not :P


Quote:
Because if you are only thinking about the scenario where the envelopes remain closed, then I think we agree swapping means nothing.
Ok, time to get to the point.. We can separate two elements of the problem.
We are looking for the expected gain of switching. The expected gain is the probability of gain times the possible gain minus the probability of loss times the possible loss:
E(G) = P(G)*G - P(L)*L
Now what I've been trying to get across is that G=L.
Once the two envelopes are filled with money and lie on the table, the amount you can gain by switching is equal to the amount you can lose by switching, because you can only go from the one actual state to the other or back.
What I've been ignoring, and what rmsgrey reminded me of, was P(G) and P(L). Now if you _know_ one envelope contains $50 and the other $100, then obviously if you open the one with $50 you will switch, because you know the other must contain the $100. P(G) is then 1 (you know for certain you will gain by switching), and P(L)=0, so E(G) = 1*$50 - 0*$50 = $50.
If however you don't know anything about the distribution of money in the envelopes, and assume both are equally likely, then P(G)=P(L)=0.5, and E(G)=0, and then it doesn't matter whether you switch or not.

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Sep 8th, 2004, 5:02am

on 09/08/04 at 00:52:27, towr wrote:
Ok, tiem to get to the point.. We can seperate two elements of the problem.
We are looking for the expected gain of switching. The expected gain is the probability of gain times the possible gain minus the probability of loss times the possible loss
E(G) = P(G)*G - P(L)*L
Now what I've been trying to get across is that G=L
Once the two envelopes are filled with money and lie on the table the amount you can gain by switching is equal to the amount you can loose by switching, because you can only go from the one actual state to the other or back.
What I've been ignoring, and what rmsgrey reminded me of was, P(G) and P(L). Now if you _know_ one envelope contains $50 and the other $100, then obviously if you open the one with $50 you will switch. Because you know the other must contain the $100. P(G) is then 1 (you know for certain you will gain by switching), and P(L)=0, so E(G) = 1*$50-0*$50=$50.
If however you don't know anything about the distribution of money in the envelope, and assume both are equally likely, then P(G)=P(L)=0.5, and E(G)=0, and then it doesn't matter whether you switch or not.

OK, so time to get down to basic definitions in probability. Firstly, I hope we can all agree that:
E(V)=P(W)*W-P(L)*L
(I prefer not to use G for two different concepts so I'm using V for value and W for win).
Secondly, the precise value of P(X), correctly evaluated, is dependent on who evaluates it (more precisely on the information available to the person who evaluates it). For instance, if I pick a card from a standard 52 card pack and look at it, then you would correctly evaluate P(Ace) at 1/13. I on the other hand, knowing that it is the ace of diamonds, would correctly evaluate P(Ace) as 1. A way to look at it is that P(X), as evaluated by you, is the proportion of situations where you have exactly the same (relevant) information and it turns out that X is the case.

For the envelope game, from the point of view of someone who knows in advance what the two values in the envelopes are, but not which is which, before opening, the expected gain from switching is 0, while after opening, the expected gain is either +L or -L, depending on whether the opened envelope contains L or 2L

From the point of view of someone who only knows that one envelope contains twice the other, again, before opening, E(V)=0. Once the envelope you pick is opened to reveal some amount, K, the question becomes whether you opened the lower of game {K,2K} or the higher of game {K/2,K} - of every 100 times you open an envelope containing K, some number, 100*P(W), will be the game {K,2K} and 100*P(L) will be the game {K/2,K}. So in this case:

E(V|K) = P(W|K)*(2K-K) + P(L|K)*(K/2-K)
       = P({K,2K}|K)*K - P({K/2,K}|K)*K/2
       = ( 2*P({K,2K}) - P({K/2,K}) ) * K / ( 2*( P({K,2K}) + P({K/2,K}) ) )
which is positive precisely when 2*P({K,2K}) - P({K/2,K}) is, meaning that for a given K, you should switch when it's at least half as likely that the envelopes originally contain K and 2K as it is that the envelopes originally contain K/2 and K.

It doesn't matter whether the envelopes were filled a thousand years ago, this morning, or as you decide to open them (provided the probablity of each outcome remains independent of your choice) - as long as the probabilities of the other envelope containing 2K or K/2 are the same, the conclusion should be the same.
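As an illustration of that rule, here is a sketch in Java that applies it to one assumed prior - a geometric prior over pairs {2^k, 2^(k+1)} with ratio 1/3 (the ratio, the cutoff, and the class name are all arbitrary). Because this prior falls off faster than a factor of 2 per step, E(V|K) comes out negative for every K except the smallest possible value, so you would only switch there.

Code:
public class ConditionalSwitchRule {

      static public void main (String[] args)
      {
            int pairs = 12;             // arbitrary cutoff
            double ratio = 1.0 / 3.0;   // arbitrary: P(pair k) proportional to ratio^k
            double[] p = new double[pairs];
            double norm = 0;
            for (int k = 0; k < pairs; k++) { p[k] = Math.pow(ratio, k); norm += p[k]; }
            for (int k = 0; k < pairs; k++) p[k] /= norm;

            // pair k contains 2^k and 2^(k+1), so the possible opened values are K = 2^k for k = 0..pairs
            for (int k = 0; k <= pairs; k++)
            {
                  double K = Math.pow(2, k);
                  double pHigherPair = (k < pairs) ? p[k] : 0;   // the pair was {K, 2K}
                  double pLowerPair = (k > 0) ? p[k - 1] : 0;    // the pair was {K/2, K}
                  // the formula above: E(V|K) = (2*P({K,2K}) - P({K/2,K})) * K / (2*(P({K,2K}) + P({K/2,K})))
                  double ev = (2 * pHigherPair - pLowerPair) * K / (2 * (pHigherPair + pLowerPair));
                  System.out.println("open " + (long) K + ": E(V|K) = " + ev + (ev > 0 ? "   -> switch" : "   -> stay"));
            }
      }
}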

Another variation on the same game: You have 100 envelopes, some number of which, 100*P(W), contain cards with a plus sign on; the remainder (100*P(L)) having minus signs on their cards. You pick an envelope, without opening it, and the host offers, without knowing the contents, to buy it from you for $100. If you open the envelope, then a plus sign is worth $200, and a minus sign worth $50. I hope it's obvious in this game that, provided 2P(W)>P(L), you should gamble, while if 2P(W)<P(L) you should take the $100, despite the fact that the production assistant who put the cards in the envelopes already knows what's in the envelope you're holding... The question, of course, is how well this reflects the situation when you open an envelope in the original game and find $100 inside...

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 8th, 2004, 5:34am

on 09/08/04 at 05:02:21, rmsgrey wrote:
From the point of view of someone who only knows that one envelope contains twice the other, again, before opening, E(V)=0. Once the envelope you pick is opened to reveal some amount, K, the question becomes whether you opened the lower of game {K,2K} or the higher of game {K/2,K}
I disagree; the question is whether K=L or K=W of game {L,W}.
And E(V) = P(K=L)*(W-L) + P(K=W)*(L-W).
There aren't two games, but only one.

gah.. *rips out hair* this is getting us nowhere..

Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 8th, 2004, 2:21pm

on 09/08/04 at 05:34:19, towr wrote:
There aren't two games, but only one.

gah.. *rips out hair* this is getting us nowhere..


Towr, I think I follow your points.

To paraphrase, you say:
1.  The gameshow host or his assistant knows how much is in the envelopes (k and 2k, say);
2.  It does not matter that the contestants do not;
3.  You have a 50% chance of getting the higher envelope, and a 50% chance of getting the lower.  Higher and lower were fixed in point 1 above.
4.  Swapping therefore means you either gain k or lose k.
5.  Swapping is exactly the same if both envelopes are closed or if one is open.

All good points, and all true.  But they are of interest to an historian, not to a gambler.  The facts are only known to the contestant (as opposed to the host) after the end of the game and they are not much use then!

Let's say I play.  I open an envelope and find $100.  I decide to gamble and swap.  The other envelope has $50.  After the game has been played, some guy comes up to me in the bar and says, "If you had not swapped, you would be $50 better off."  I thank him politely for his contribution to the sum total of my knowledge.

Who would take the following bet?

I (Person A) take $10 out of my pocket and you (Person B) toss a coin several times.  

Perhaps:

If the coin is heads, I give you the $10 and you give me $20.
If the coin is tails, I give you the $10 and you give me $5.

Other times, we vary the bet so that:

If the coin is tails, I give you the $10 and you give me $20.
If the coin is heads, I give you the $10 and you give me $5.

1.  Sometimes you first nominate which of the above bets you will offer, then I agree to it (or refuse), then you toss the coin.

2.  Sometimes you  offer the bet, then you toss the coin (and look at it, but I don't) and then I decide whether or not to agree to the bet.

3.  Sometimes you toss the coin but do not look (I trust you ::) ), then you choose which of the above bets to offer me and I choose if I want to accept or not.

I would always be happy to take the gamble in any of the above scenarios.  

Is there anybody reading this thread who would refuse any of the gambles if they were person A?

In scenario 1 - neither of us know if heads or tails will come up when we choose the bet (and I choose  whether to accept or refuse).

In scenario 2 - you know if it's heads or tails when I choose whether to accept or refuse.  You therefore know if I will win or lose.

In scenario 3 - God (or maybe an independent 3rd human in the room) may know if it is heads or tails when I choose  whether to accept or refuse the bet, but that's no use to me.  I do not know (and nor do you).

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 8th, 2004, 11:26pm

on 09/08/04 at 14:21:25, Padzok wrote:
All good points, and all true.  But they are of interest to an historian, not to a gambler.  The facts are only known to the contestant (as opposed to the host) after the end of the game and they are not much use then!
I disagree. If I were the contestant I'd know those points were true as soon as the game starts. I would not know the value of k, but I'd know it exists and is fixed.

Besides, are you honestly saying that you should always switch? Pick either of the two envelopes, look inside, and then switch? Because that means you always end up with the envelope you didn't pick first, which means you might as well have picked that one first and left the game.

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 9th, 2004, 2:37am
How about this version of the gamble.

In this game again there are the two envelopes, and again one contains twice the amount in the other.
You can pick either and look inside. If you want to switch, however, you have to pay 10% of what's inside the envelope you opened. Also, at the end of the game you'll have to pay the entry fee for the game, which is 95% of the average of the two envelopes.

'Clearly' ;) if you open an envelope and there's $100 inside it, there's a 50% chance that by switching you'll get $200, at a cost of $10 + 95%*$150, a gain of $47.50.
On the other hand there's a 50% chance you'll switch to $50, at a cost of $10 + 95%*$75, a total loss of $31.25.
So you'd expect to win $8.125 on average (whenever you open $100), or generally an 8.125% profit. ::)

Me, I'd be happy to run this scam, if you're willing to bet :P
Because if you always switch, you'll pay 105% of the average of the two envelopes, while gaining only the average.
If the envelopes on the table are $50 and $100, you'll pick $50 half of the time, and pay $5 to switch to $100, a total gain of $23.75 for you. And the other half of the time you'll have picked $100 and pay $10 to switch to $50, a total loss of $31.25. On average a loss, for you, of $3.75, in other words a neat 5% profit for me.  ;D

[e]Oh yeah, almost forgot. If you decide not to switch ever, you'd have a profit of 1/19th, or about 5.26%, a 5% loss for me, so don't do that ;)[/e]

[e2]Maybe an entry fee of 88%, and a 24% switch fee, would be more appropriate. It'd also give me a bigger profit :P
I'm sure you'll find out why that might be better all-round.[/e2]
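For the curious, a sketch of that scam in Java with the $50/$100 pair and the original 10%/95% fees (the trial count and class name are arbitrary): always switching averages about -$3.75 per game and never switching about +$3.75, matching the figures above.

Code:
import java.util.Random;

public class FeeGame {

      static public void main (String[] args)
      {
            Random rng = new Random();
            double low = 50, high = 100;                 // the pair used in the post
            double entryFee = 0.95 * (low + high) / 2;   // 95% of the average of the two envelopes
            int trials = 1000000;                        // arbitrary
            double alwaysSwitch = 0, neverSwitch = 0;
            for (int t = 0; t < trials; t++)
            {
                  double opened = rng.nextBoolean() ? low : high;
                  double other = (opened == low) ? high : low;
                  neverSwitch += opened - entryFee;
                  alwaysSwitch += other - 0.10 * opened - entryFee;   // pay 10% of the opened amount to switch
            }
            System.out.println("always switch, average per game: " + alwaysSwitch / trials);   // about -3.75
            System.out.println("never switch,  average per game: " + neverSwitch / trials);    // about +3.75
      }
}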

Title: Re: HARD: ENVELOPE GAMBLE
Post by mattian on Sep 9th, 2004, 8:59am
Towr - COME ON !!!!

We're trying to prove the earth is spherical and you go and throw a spanner in the works by erecting mountains and valleys on it.

Let's first avoid the witchhunters before we throw a whole bunch of detail into our "witchcraft".


Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 9th, 2004, 12:50pm

on 09/08/04 at 23:26:06, towr wrote:
I disagree, If I were the contestant I'd know those points were true as soon as the game starts. I would not know the value of k, but I'd know it exists and is fixed.


OK.  I do see that.  

Say you open the envelope and it contains $100.

You know that you either have the higher amount (H) or the lower amount (L).  It is 50:50 and swapping does not increase your chances of having the higher amount.  Similarly, nothing you can do can increase (or decrease) the amount of money in either envelope.  So there are only 2 possibilities, you either finish with H or you finish with L.

But even though you know all that, the question you have to decide is whether you are willing to take the gamble of swapping, knowing there is a 50% chance you will win/improve and a 50% chance you will lose/decrease.

If you lose you are $50 worse off.  If you win, you will be $100 better off.

The fact that other people (but not me) already know which of the two it will be if I decide to swap does not put me off such excellent odds.


on 09/08/04 at 23:26:06, towr wrote:
Besides, are you honestly saying that you should always switch? Pick either of the two envelopes, look inside, and then switch? Because that means you always end up with the envelope you didn't pick first, which means you might as well have picked that one first and left the game.



Again, I think you are right.  

For example, let's take 10 pairs of envelopes (for each pair one is x, one is 2x, but x is different for each pair).

The simulation is that Player A selects an envelope from each pair at random, then swaps every time, so the other envelope is the one he wins.

The simulation is run a million times.

Then we use the same 10 pairs and run the simulation a further million times, but this time Player A picks an envelope and keeps his first pick.

The 2 different methods for Player A will make no difference to the average money he walks away with (which I presume is the sum of 1.5x for each of the 10 pairs).
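
(For concreteness, here is a rough Python sketch of that simulation - the x values for the ten pairs are made up, and far fewer runs than a million are used, purely for illustration:)

import random

# Made-up x values for the 10 pairs; each pair is (x, 2x).
pairs = [(x, 2 * x) for x in (1, 3, 7, 10, 25, 40, 60, 120, 300, 555)]

def run(always_swap, runs=100000):
    total = 0
    for _ in range(runs):
        for pair in pairs:
            first, other = random.sample(pair, 2)
            total += other if always_swap else first
    return total / runs

print(run(True))   # close to the sum of 1.5x over the 10 pairs, whether he swaps...
print(run(False))  # ...or keeps his first pick
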

But to answer TOWR's question, yes I would swap every time I played the game and was holding the open envelope.

I know that I am taking a one-off gamble, not playing a million times with the same 10 pairs, and I like the odds.  I like them no less than in the coin toss gambles I mentioned earlier.

If I knew that I was playing a million times with the same envelopes, or same 10 pairs, I would know strategy would be irrelevant.


One of the fundamental problems with the whole scenario is that I start off with zero, and play a game.

I know that when the game is over, I will have either K or 2K whatever I do (and I do not know what K is).

I literally cannot lose.

I open it and it's $100.

If I swap and go from high to low, then I will be a little disappointed but I still will have won something.  

I am $50 better off than when I started the game, even though at one point during the game (but before it was over) I was $100 better off.

If I swap and go from low to high, then I will be overjoyed.  

I am $200 better off than when I started the game, even though at one point during the game (but before it was over) I was $100 better off.  




Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Sep 9th, 2004, 1:05pm
OK, let's try a finite distribution:

There are 1000 pairs of envelopes, the kth pair being {2^(k-1), 2^k} (suppose for simplicity and to save my putting in so many superscripts, the actual contents of each envelope are a piece of card with a number between 0 and 1000 on (inclusive), representing an actual value of 2 to that power)

The game goes as follows:

1) A pair of envelopes is chosen at random.
2) I pick an envelope and get to look inside it.
3) I, knowing the setup, decide whether to switch or not.
4) I win the value of the envelope I switch to/stick with.

I claim that the optimum strategy here is to switch for any number except 1000, which comes up once in 2000 games.

Generalise to N pairs of envelopes. When N=1, you get Towr's model of the game. As N gets arbitrarily large, you tend towards my model of the game, where, for any finite distribution, there is always one case where I will refuse to swap, but only a 1 in 2N chance of that case ever coming up. As N goes to infinity, a) the world economy crashes, and b) the chance of my not switching goes to 0.
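
(To see this concretely, here is a rough Python sketch with N=10 pairs instead of 1000, so the numbers stay readable - the code and the game count are just illustrative:)

import random

N = 10  # scaled-down version of the game above: pairs {2^(k-1), 2^k} for k = 1..N

def play(strategy, games=200000):
    total = 0
    for _ in range(games):
        k = random.randint(1, N)
        first, other = random.sample([2 ** (k - 1), 2 ** k], 2)
        total += other if strategy(first) else first
    return total / games

print(play(lambda v: True))         # always switch
print(play(lambda v: False))        # never switch: same average as always switching
print(play(lambda v: v < 2 ** N))   # switch unless you see the maximum: does better than both
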



Suppose I accept Towr's model. The values in the envelopes are fixed: 50 and 100. Why isn't my ideal strategy to switch when I see 50, and stick when I see 100? After all, God knows that's what I should do, the production assistant knows that's what I should do. Even the annoying guy in the bar afterwards knows (afterwards) that that's the strategy I should have played with...


Or complete this sentence: "If the envelope I open contains $100, the other envelope will contain ____"

Title: Re: HARD: ENVELOPE GAMBLE
Post by mattian on Sep 9th, 2004, 1:14pm
Let me put my 2 cents in here again:

Padzok,

If you win, your gain will ALWAYS be twice your loss if you lose.

The point here is that switching has nothing to do with these outcomes.  You can switch if you like, but it won't improve your odds in gaining.

Let me put it in your terms and prove the converse.

You have (L) and (H) and you don't know which is which or the amounts they represent.

You're given an envelope and told not to open it.  You're then shown another envelope on the table.  You're told you may keep the envelope in your hand or default to the one on the table.

By your reasoning, if the envelope on the table contains x dollars, then I stand to gain x dollars by keeping my envelope but only stand to lose x/2 dollars by doing so.

Similarly, if the envelope in my hand contains y dollars, then I stand to gain y dollars by switching, and only stand to lose y/2 dollars by doing so.

From where do you derive the bias in favour of switching?





Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Sep 9th, 2004, 1:34pm
Mattian:

Padzok, Towr and I are currently discussing the variation where you do open the envelope before choosing whether to switch or not.

I don't think anyone in the current debate believes that you should switch if you don't open the envelope.

Title: Re: HARD: ENVELOPE GAMBLE
Post by mattian on Sep 9th, 2004, 1:37pm
Whoopsy -

I knew Towr and you were referring to the variation - but I misinterpreted Padzok - I thought he was referring to the original problem.

Within an unknown distribution, though, an opened envelope is no more useful than an unopened one.  But the scenario you recently posted does provide a known distribution - right?

I'm catching up - slowly.

Title: Re: HARD: ENVELOPE GAMBLE
Post by mattian on Sep 9th, 2004, 1:41pm
Rms:

Why only for 1000?

I wouldn't switch for any number greater than 500.

Or have I misunderstood you?

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 9th, 2004, 1:55pm

on 09/09/04 at 13:05:35, rmsgrey wrote:
Generalise to N pairs of envelopes. When N=1, you get Towr's model of the game. As N gets arbitrarily large, you tend towards my model of the game, where, for any finite distribution, there is always one case where I will refuse to swap, but only a 1 in 2N chance of that case ever coming up. As N goes to infinity, a) the world economy crashes, and b) the chance of my not switching goes to 0.
That one-in-a-gazillion chance changes everything. It still doesn't change the per-game gain/loss, but it definitely changes the probability of gaining/losing. If you get the minimum there is a 100% chance you will gain from switching, and consequently you will do so.

And of course if you don't know what the minimum and/or maximum is, it doesn't help you at all. If you then switch indiscriminately, you go back to average scoring, just like when you hadn't opened any envelope.

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Sep 9th, 2004, 2:07pm

on 09/09/04 at 13:41:46, mattian wrote:
Rms:

Why only for 1000?

I wouldn't switch for any number greater than 500.

Or have I misunderstood you?

I snuck in a logarithmic scale, so when I say 1000 I actually mean 2^1000, 500 would be 2^500, etc.

Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Sep 9th, 2004, 2:15pm

on 09/09/04 at 13:55:47, towr wrote:
That one-in-a-gazillion chance changes everything. It still doesn't change the per-game gain/loss, but it definitely changes the probability of gaining/losing. If you get the minimum there is a 100% chance you will gain from switching, and consequently you will do so.

And of course if you don't know what the minimum and/or maximum is, it doesn't help you at all. If you then switch indiscriminately, you go back to average scoring, just like when you hadn't opened any envelope.

So in practice, I pick an arbitrarily large integer and decide in advance that I'll switch for anything less than that, and accept that moving my cut-off higher would improve my expected winnings (provided the probabilities are such that (on the log2 scale) P(K)/P(K+2)<2)

Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 9th, 2004, 2:21pm

on 09/09/04 at 13:14:37, mattian wrote:
Let me put my 2 cents in here again:

Padzok,

If you win, your gain will ALWAYS be twice your loss if you lose.


I don't know what you mean.  I'm not saying your point is wrong; just that I do not understand it.


on 09/09/04 at 13:14:37, mattian wrote:
The point here is that switching has nothing to do with these outcomes.  You can switch if you like, but it won't improve your odds in gaining.

Let me put it in your terms and prove the converse.

You have (L) and (H) and you don't know which is which or the amounts they represent.

You're given an envelope and told not to open it.  You're then shown another envelope on the table.  You're told you may keep the envelope in your hand or default to the one on the table.

By your reasoning, if the envelope on the table contains x dollars, then I stand to gain x dollars by keeping my envelope but only stand to lose x/2 dollars by doing so.

Similarly, if the envelope in my hand contains y dollars, then I stand to gain y dollars by switching, and only stand to lose y/2 dollars by doing so.


We agree.  I have all that, sometimes in very similar language, in previous posts.


on 09/09/04 at 13:14:37, mattian wrote:
From where do you derive the bias in favour of switching?


What I said was that if I was in the situation of having $100 in my hand, and somebody offered me the chance to swap with 50% chance of $200 and 50% of $50 I would take the bet.

I suggested a game in an earlier post.  All variations involve me putting up $100 and us tossing a coin.   Regardless of whether it is heads or tails, I always give the $100 to you.  If it is heads, you give me $200.  If it is tails, you give me $50.

Do you want to play?

I'll play once if you like; or 10 times; or 100.

(See my earlier post for my suggestions of how we can randomise it between heads and tails, and how we can play the game even after the coin toss is known either to you or to an independent third party - or both.  I'll still play regardless of the scenario).

Title: Re: HARD: ENVELOPE GAMBLE
Post by towr on Sep 9th, 2004, 2:24pm

on 09/09/04 at 14:15:01, rmsgrey wrote:
So in practice, I pick an arbitrarily large integer and decide in advance that I'll switch for anything less than that, and accept that moving my cut-off higher would improve my expected winnings (provided the probabilities are such that (on the log2 scale) P(K)/P(K+2)<2)

That depends on where the actual bounds lie. And I'm wondering if you get actual information out of it (as with knowing or finding the actual bounds).

Of course I can't get a simple equality to work at the moment (in the infinite case it's a problem of rearranging the terms of a series that isn't absolutely convergent, which I think we determined earlier, and which was a problem in another thread. For the finite case that should naturally not be a problem, and should yield the same result.)

Title: Re: HARD: ENVELOPE GAMBLE
Post by Padzok on Sep 14th, 2004, 11:02am

Quote:
There are 1000 pairs of envelopes, the kth pair being {2^(k-1), 2^k} (suppose for simplicity and to save my putting in so many superscripts, the actual contents of each envelope are a piece of card with a number between 0 and 1000 on (inclusive), representing an actual value of 2 to that power)




on 09/09/04 at 14:15:01, rmsgrey wrote:
So in practice, I pick an arbitrarily large integer and decide in advance that I'll switch for anything less than that, and accept that moving my cut-off higher would improve my expected winnings (provided the probabilities are such that (on the log2 scale) P(K)/P(K+2)<2)


I agree of course that swapping for all (k-1) for k=1 to k=1000 is the best strategy, so long as we stick when we actually do see 2^1000 on the card.

But is that not a feature of this particular series?  Namely 2^1000 dwarfs the second highest number in the series (2^999, which appears twice).

In other words, is it not the very fact that we have chosen a tactic which guarantees a win for 2^1000 which produces the benefit, and not the fact we swap for every other number?

Indeed, according to my arithmetic (which is usually wrong) if we swapped for every single number, then the loss from 2^1000 to 2^999 is so great that it means the entire strategy shows a net deficit.

I do think the idea of taking an arbitrary cut-off is promising.  :)

But I think that higher is not necessarily better.   :-/

For a finite series of the type rmsgrey describes, where we do not know the upper limit of k, the most important thing is that our cut-off is somewhere below the upper limit of k.

I think that will always show a profit (again my arithmetic could be wrong) for this particular type of series.  But if the cut off is even slightly too high, then we would expect a loss over a long enough time period.

Is it agreed this type of series is a very special case, though?  If I was playing this gameshow, my strategy would not be based on the assumption that the show used such a series from which to draw its envelopes.

{PS...I would still gamble on a swap though... ;D}



Title: Re: HARD: ENVELOPE GAMBLE
Post by rmsgrey on Sep 15th, 2004, 6:40am
For the finite case, always swapping has to break even - for instance, with the 1000 pairs of envelopes, if you play 2000 times and (by some freak of circumstance) get each possible outcome exactly once, you'd get the same set of 2000 outcomes as if you never swapped - each value strictly between 0 and 1000 twice, and the two extremes once each.

For a more general distribution, the role of the unique end values is taken by the set of values less than twice the minimum value and the set of values more than half the maximum.

In general, playing with a cut-off means you win big on the cut-off value, and break even elsewhere in the long run, so for any given distribution, your winnings are solely determined by the cut-off value. With infinite distributions with unbounded expectation, the higher the cut-off value, the greater the expected winnings, but it's unclear what happens when the cut-off actually reaches infinity.
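
(A small exact check of that claim, again using N=10 pairs of {2^(k-1), 2^k} rather than 1000 - the code and the particular cut-off values are just illustrative:)

from fractions import Fraction

N = 10  # pairs {2^(k-1), 2^k} for k = 1..N; each of the 2N outcomes is equally likely

def expected_winnings(cutoff):
    # Strategy: switch if and only if the opened value is below the cut-off.
    total = Fraction(0)
    for k in range(1, N + 1):
        for first, other in ((2 ** (k - 1), 2 ** k), (2 ** k, 2 ** (k - 1))):
            total += other if first < cutoff else first
    return total / (2 * N)

never = expected_winnings(0)  # a cut-off of 0 means never switching
for cutoff in (2 ** 3, 2 ** 6, 2 ** 9, 2 ** 10, 2 ** 11):
    print(cutoff, expected_winnings(cutoff) - never)
# The gain over never switching is 2^(m-1)/(2N) for a cut-off of 2^m (m <= N),
# and drops back to 0 once the cut-off exceeds the maximum - i.e. always
# switching breaks even, and the winnings depend only on where the cut-off sits.
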

Title: Re: HARD: ENVELOPE GAMBLE
Post by Nasta on Sep 3rd, 2005, 5:38pm
Nice problem! I read all the pages and was very intrigued by some posts, especially those by rmsgrey. He had some really deep insight that goes right to the heart of the problem - which is THE FORMULATION of the riddle, and THE ASSUMPTION of the player in it. Let's use 200 and 400 as the "real world" numbers. Two equally likely cases can happen: he gets 200 and thinks "I can get 400 x 0.5 + 100 x 0.5", or he gets 400 and thinks "I will get 800 x 0.5 + 200 x 0.5" - and obviously in both cases he wins by switching. The truth, however, is that he can never IN FACT get 800 or 100, and the imaginary expected value won by getting 800 (which cannot happen in fact) is bigger than the imaginary expected value brought by getting 100 (which cannot be achieved either): the 800 brings +400 (equal to the sum in his envelope), while the 100 brings only +50 (0.25 of the sum in his envelope). The bonus comes just because one of the assumed figures is twice the higher of the two real envelope values, and the other is half the lower of them. I think that is what rmsgrey meant, and I hope I've made it a little clearer (if not a lot more confusing :lol:) for you now :)

Title: Re: HARD: ENVELOPE GAMBLE
Post by Nasta on Sep 3rd, 2005, 5:48pm
This is why the reasoning will fail in the long run - the player's calculation gives an expected value of 500 when X = 400 and 250 when X = 200, i.e. 375 on average, but he will actually gain only 300 on average; the extra 25% is only imaginary.
;D ;D ;D ;D
Props to rmsgrey for solving the problem ;)

P.S. No reason not to switch though; both cases lead to the same result, so you may swap or not as you wish...
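
A quick numeric restatement of that example (just the arithmetic from the two posts above, in Python):

envelopes = (200, 400)

# What the player imagines after opening each envelope (the flawed calculation):
imagined = {x: 0.5 * 2 * x + 0.5 * x / 2 for x in envelopes}   # {200: 250.0, 400: 500.0}
imagined_avg = sum(imagined.values()) / 2                       # 375.0

# What actually happens on average if he always switches (he just gets the other envelope):
actual_avg = sum(envelopes) / 2                                 # 300.0

print(imagined_avg, actual_avg, imagined_avg / actual_avg)      # 375.0 300.0 1.25  (the "imaginary" 25%)
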


