Jlall, on Jul 18 2007, 11:36 AM, said:
A: Say you have 2 envelopes filled with money. The value of one envelope is half the value of the other, and the amounts can be arbitrarily large.
B: Ok.
A: Say you are handed one envelope, open it, and find it contains 5,000 dollars. You are offered the choice to switch envelopes. Do you?
B: Of course; the other envelope could have either 10,000 dollars or 2,500 dollars, so my EV is higher by switching.
Person B is obviously wrong, but I cannot find the flaw in his argument. Can you prove specifically why person B's last statement is wrong? (I'm not looking for "there are 2 possible cases," etc.; I'm looking for the specific flaw in person B's thinking, which I must be missing.)
I am not sure whether people have already said what I am going to say, but I will say it anyway.
The answer is, "it depends".
Unless we are given the "method" (the probability distribution) with which the amounts were chosen, we cannot tell whether switching is good or bad. That is the flaw in B's reasoning: treating the 10,000 and 2,500 cases as equally likely is itself an assumption about how the amounts were chosen, and switching is only better in expectation when the pair {5000, 10000} is more than half as likely as the pair {2500, 5000}.
For instance, we can pick the amounts with probabilities such that, if the envelope you open contains $5000, you are better off not switching.
With a different "method" we can pick the amounts such that, if the envelope you open contains $5000, you are better off switching.
(If you want concrete examples, I refer you to http://www.ocf.berkeley.edu/~wwu/cgi-bin/y...14781;start=25)
Basically, there is no _always switch_ or _never switch_ strategy that works; it depends entirely on the underlying "method" with which the numbers were chosen.
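To make that concrete, here is a quick Python sketch (the two priors are my own made-up examples, not the ones from the linked thread). The prior is over the smaller amount in the pair; we simulate handing you a random envelope, keep only the trials where it shows $5000, and average what the other envelope held.

```python
import random

def avg_other_given_opened(prior, observed=5000, trials=200_000):
    """Estimate E[other envelope | opened envelope == observed].
    `prior` maps each possible *smaller* amount to its probability."""
    amounts, weights = zip(*prior.items())
    hits, total_other = 0, 0.0
    for _ in range(trials):
        small = random.choices(amounts, weights=weights)[0]
        envelopes = [small, 2 * small]
        random.shuffle(envelopes)          # you are handed one at random
        opened, other = envelopes
        if opened == observed:             # keep only trials where you see $5000
            hits += 1
            total_other += other
    return total_other / hits

# Prior where the pair is usually {2500, 5000}: switching loses on average.
print(avg_other_given_opened({2500: 0.9, 5000: 0.1}))  # ~3250 < 5000 -> don't switch

# Prior where the pair is usually {5000, 10000}: switching gains on average.
print(avg_other_given_opened({2500: 0.1, 5000: 0.9}))  # ~9250 > 5000 -> switch
```

Same observed $5000, opposite answers, purely because the underlying "method" differs.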