You are given 2 boxes A and B. Each box has some unknown amount of money, but it is known that one of boxes has 10 times more money than the other. You take box A and open it. There is $1000. You have a choice to take box B instead of A. Should you take box B or stay with A?


Don't. Jews wouldn't have 1000$ as a consultation prize.

This isn't even game theory. It's a literal 50/50: knowing how much money is in one box doesn't tell you how much is in the other, just the probability (50%) that it's more than yours. Flip a coin if you must, but there is no "should", since neither outcome is more probable than the other.

if you swap the expected value is (10000+100)/2 = 5050, so yes

you get $1000 if you stay and $5050 on average if you swap

The average amount of money in the boxes in no way, shape or form dictates the probability of choosing the right one.

what is probability that box B has $100?
50%
what is probability that box B has $10000?
50%

if you swap you end up with $100 or $10000 with 50% probability, therefore you should swap to maximize the expected value.

You can't maximize the expected value, because you have 0 reason to expect one result over the other.

I agree this is the right answer, but why shouldn't the EV be $4050, since we already have $1000 (either we gain $9000 or lose $900)?

this
/thread

Interestingly, if you go a step back (before opening box A), you can do better than a 50% winning chance:

Choose a number at random, [math] X [/math]. If you then pick a box, keep it if [math] X [/math] is smaller than the amount in the box.
There are 3 cases:
I) [math] X [/math] is smaller than [math] A [/math] and [math] B [/math], with probability [math] P = x_1 [/math]
II) [math] X [/math] lies between [math] A [/math] and [math] B [/math], with [math] P = x_2 [/math]
III) [math] X [/math] is bigger than [math] A [/math] and [math] B [/math], with [math] P = x_3 [/math]

In cases I) and III), it is a 50% chance to pick the better box, so your "win" probability would be
[math] \frac{x_1 + x_3}{2} + x_2 [/math]
and since [math] x_1 + x_2 + x_3 = 1 [/math], this equals
[math] \frac{1}{2} + \frac{x_2}{2} [/math]
Because [math] x_2 > 0 [/math], you get a better than 50% probability.
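Here's a quick Monte Carlo sketch of that threshold strategy (my own illustration, not from the thread: the box amounts and the exponential threshold distribution are arbitrary choices; all that matters is that x_2 > 0):

```python
import random

def threshold_win_rate(lo, hi, trials=100_000, seed=0):
    """Estimate the win probability of the rule: pick a box at random,
    draw a random threshold X, keep the box iff its amount exceeds X.

    lo, hi: the two amounts in the boxes (hi > lo). X is drawn from an
    exponential distribution so that P(lo < X < hi) > 0 for any pair
    of positive amounts.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        picked, other = (lo, hi) if rng.random() < 0.5 else (hi, lo)
        x = rng.expovariate(1 / 1000)  # random threshold, mean 1000
        kept = picked if picked > x else other  # keep iff amount > X
        wins += kept == hi
    return wins / trials

rate = threshold_win_rate(1000, 10000)
```

With these particular numbers P(1000 < X < 10000) is about 0.37, so the win rate lands around 0.68 instead of 0.50.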

You accept a bet if the possible gain times probability of winning exceeds possible loss times probability of losing.
If you take box B there's a 50% chance of gaining $9000 vs. a 50% chance of losing $900. That says you should switch.

Something's wrong with the logic here, since the same would apply if you'd chosen box B first.
After all, if one person chose A and another person chose B, they can't BOTH have a winning strategy by switching.

It's a good problem and I hope someone can explain the correct solution clearly.

It's a well known problem you'll find a lot in the web:
en.wikipedia.org/wiki/Two_envelopes_problem
brilliant.org/wiki/two-envelope-paradox/
mathoverflow.net/questions/9037/how-is-it-that-you-can-guess-if-one-of-a-pair-of-random-numbers-is-larger-with

Yes. Risk reward says the choice is always profitable.

>game theory goyim believe that loss and gain have equal weight

I posted but recognized something was wrong with my logic.
So I read en.wikipedia.org/wiki/Two_envelopes_problem
particularly the "simple solution" section.
There, the envelopes contain 1 and 2 dollars.
If you have 1 and switch, you gain 1
If you have 2 and switch, you lose 1
Since the expected gains and losses are equal, there's no point in switching. That's clear.
But here we have a TEN to one ratio. If your envelope contains 10, then you could gain 90 or lose 9 by switching. Switching again seems advantageous.
So your links really aren't clearing this up for me.

Being in this situation, I know for sure no one would have put 10k as a prize for a random stranger like me. I pick my 1k and get the fuck outta here

>$5050 in average
Literally lies, damned lies and statistics.

>But here we have a TEN to one ratio.
The ratio makes no difference:

>If you have 1 and switch, you gain 1
>If you have 2 and switch, you lose 1
If you have 1000 and switch, you gain 9000
If you have 10000 and switch, you lose 9000

I see what you're saying but it still seems asymmetric.
Your numbers are right, but you don't know whether you have the larger or smaller amount. You just know you have X and could gain 9X or lose 0.9X.
I'm afraid I still don't understand the flaw -- though your example helps.

The problem is treating it as unconditional when it is conditional.
S: "switch", H: "high amount", L: "low amount". The values in the boxes: x and kx, then:
[math]
E(S) = E( S | H) \cdot P(H) + E( S | L) \cdot P(L) \\
E(S) = -(k-1)x \cdot \frac{1}{2} + (k-1)x \cdot \frac{1}{2} \\
E(S) = 0
[/math]
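A sanity-check simulation of that conditional computation (my own sketch; x = 1000 and k = 10 are just the numbers from the OP): fix the pair (x, kx), pick a box uniformly at random, always switch, and the average gain hovers near zero.

```python
import random

def mean_switch_gain(x=1000, k=10, trials=200_000, seed=1):
    """Average gain from always switching when the pair (x, kx) is
    fixed in advance and the first pick is a fair coin flip.
    E(S) = -(k-1)x * 1/2 + (k-1)x * 1/2 = 0, so this stays near zero."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        if rng.random() < 0.5:
            picked, other = x, k * x
        else:
            picked, other = k * x, x
        total += other - picked
    return total / trials

gain = mean_switch_gain()
```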

This isn't the two envelopes problem because you open the box first. The problem comes down to the probability distribution of the money in the boxes. Assuming that there's a 50/50 chance of gaining/losing money means assuming it's equally likely that the boxes contain $100 and $1000 or $1000 and $10000. But there's no reason to think this. Since there's no such thing as a uniform distribution across Z, Q or R we know that some amounts of money are more probable than others.

Answering this question is like answering "if I flip an unfair coin what's the chance it lands heads?"

>This isn't the two envelopes problem because you open the box first
Yes, that's true, but opening the first box doesn't give you new information. So with respect to the expectation value, nothing changes.

Yes. Expected value of the other box is 10k(.5) +100(.5) = 5050. That is more than your box's value.

Of course it does. Not all values of money in the box are equally probable. There is no such thing as a uniform distribution over Z, Q or R. It is impossible that for every value of box A, P(B=A/10) = P(B=10A). So for some values of A, the odds of B having more money must be higher or lower than 50%.

The reason the two envelopes problem works is that for any possible probability distribution of money, changing envelopes doesn't matter, so the problem doesn't need to give the distribution. If you open the box and check the money, suddenly the distribution matters.

E(Xb) = Σ x·P(x) = (100)(.5) + (10000)(.5) = 5050

E(Xa) = Σ x·P(x) = (1000)(1) = 1000

E(Xb) > E(Xa), therefore I should switch boxes

Sure, why not, $100 and $1000 don't really seem that different, but $1000 and $10,000 is a pretty big gap.

>Each box has some unknown amount of money
This is not a well-defined distribution, hence the apparent paradox.

2/3

Seems to me the amount in the box you open only matters if you've some idea of the upper bound. Otherwise, what do you compare the known figure with? Instead of real money, suppose the boxes merely have random integers written inside. If the numbers are "truly random" it makes sense to trade: Box B could hold a higher or a lower number, but there are infinitely many higher numbers and only finitely many lower ones. So the other box is more likely to hold a higher number.
Whereas if there's a known upper bound, say 1e9, you only need to check whether you're above 5e8.

To return to the original problem, you open the envelope and it contains N money. The other envelope has either 10N or 0.1N. Does it matter if N = $1000 or N= 20 cents? If not, what has opening the envelope told you?

this

>If the numbers are truly random it makes sense to trade, Box B could have a number higher or lower, but there are infinitely many higher numbers and only a finite set of lower numbers.
Once again, THERE IS NO SUCH THING AS A UNIFORM DISTRIBUTION OVER Z. There doesn't need to be an upper bound, but each integer needs a finite probability and they need to sum to 1. So let's say it's a uniform distribution up to some upper bound, but we don't know what the bound is. Not all upper bounds can be equally probable, because that would be a uniform distribution over Z.

Knowing N gives you information about whether 10N or .1N is more likely. IT IS NOT POSSIBLE FOR THEM TO BE EQUALLY LIKELY FOR ALL N.

[Not that guy]

The point about uniform distributions over Z is well taken. But I don't think it resolves the paradox.

Let's say I have a nonuniform prior about the total amount M of money in the two boxes, and I observe an amount N in box A. That prior is probably logarithmic, such that the relative probability P(M = 1.1 N) / P(M = 11 N) is constant for all values of N; or close to logarithmic, having an approximation of that same property. Assuming my valuation of money is similarly regular -- we can just assume they are utilons, and avoid that complication -- then either I will always choose to switch after seeing the value N, independent of N; or I will always stick with what I have. Which means the value of N relative to my prior does not actually play a role in resolving the paradox.
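One way to see that scale-invariance concretely (a sketch of my own; the truncation bounds are made up so the prior normalizes): under a log-uniform prior f(x) ∝ 1/x for the smaller amount, the posterior odds of holding the smaller box come out exactly even for every observed N well inside the support, which is precisely why N drops out of the decision.

```python
def p_small_given_n(n, lo=1e-6, hi=1e6):
    """P(the opened box is the smaller one | it contains n), under a
    prior density proportional to 1/x for the smaller amount on
    [lo, hi] (hypothetical truncation so the prior is proper).

    If the opened box is the smaller one, its amount has density f(n);
    if it is the larger one (holding 10 * smaller), the change of
    variables gives density f(n / 10) / 10.
    """
    f = lambda x: 1 / x if lo <= x <= hi else 0.0
    w_small, w_large = f(n), f(n / 10) / 10
    return w_small / (w_small + w_large)

p = p_small_given_n(1000)
```

Away from the truncation edges this returns exactly 0.5 for any n; near the edges it breaks, which is where any proper prior has to give up the 50/50 property.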

I only need $1000; $100 doesn't help me, so I will take box A.

[Not that guy either]

The crux of the paradox is that there cannot exist a uniform probability distribution with Archimedean support (which includes N, Z, Q, R, etc.). It's an algebraic problem that strikes all theories of probability at their foundational levels, so changing the semantics (e.g. to a Bayesian interpretation) isn't going to help.

More concretely, the paradox remains in Bayesian semantics because any nonuniform prior (e.g. logarithmic over N) can be pulled back to a uniform one (uniform over log N).

I'm pretty sure that distribution doesn't exist.

Let X be the amount of money in whichever box has less. Then P(X=x)/P(X=10x) = 1/a, where a is the constant rate of decrease. Then take the integral of the PDF from 1 to 10 and call that b. By a change of variables we can show that the integral from 10 to 100 is equal to 10*a*b, the integral from 100 to 1000 is equal to 100*a^2*b, and so on. This can only converge if a < 1/10. But the integral from .1 to 1 is equal to b/(10*a), the integral from .01 to .1 is equal to b/(100*a^2), and so on. This can only converge if a > 1/10. Thus a contradiction.

Actually I didn't need all that infinite series crap, the distribution with the property you described is 1/x, its CDF is log(x) and it obviously doesn't converge.
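The divergence is easy to exhibit numerically (a sketch; nothing here is specific to money): the would-be density 1/x puts the same mass ln(10) on every decade, and there are infinitely many decades in both directions, so the total can never be finite.

```python
import math

def decade_mass(a):
    """Integral of dx/x from a to 10a: the mass the would-be density
    1/x assigns to a single decade. It equals log(10) regardless of a,
    so summing over infinitely many decades cannot give a finite total."""
    return math.log(10 * a) - math.log(a)

masses = [decade_mass(10.0 ** k) for k in range(-6, 7)]
```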

>consultation prize

nice malapropism

>you should swap to maximize the expected value.

why does this matter when the 'expected value' is a function of long-run frequency, when presumably you get to play this game once? If you say I can play the game 50 times, then sure, I'll swap every time. But because we get one shot, expected value is less relevant.

there is no mathematical 'Gotcha!' answer to this question. It's whether you prefer to risk a month's rent for a month-long vacation, with the worst case scenario being enough money for groceries.

Ask yourself this: would you rather swap for a box that either has $2000 or $0?

Cost of opportunity will tell me to swap. But I need $1,000 right now so fuck it, I can't be rational when I'm at risk of getting evicted.

And this is why microeconomics is fucked.

sometimes you just gotta risk it for the biscuit

This. The prize has no effect on probable outcome.

Why is everyone assuming that it's equally likely that B will be 10x larger or 10x smaller? That's given nowhere in the actual question.

Why would it not be? It's 50/50.

You have literally no reason to think it is.

>one of the boxes has 10 times more money than the other.
the other box either has 1/10th or 10 times the money and this status is entirely dependent on which box you picked at first. What reasons wouldn't there be a 50/50 chance in that choice? Is box A uglier than box B?

I have literally every reason to believe it is. The prize in the box does not change the probable outcome. You pick box A. There is 50% chance you get 10x larger and a 50% chance you get an amount 10x smaller. It is literally 50/50. You have provided 0 arguments as to why it would be otherwise.

This isn't quite right: maximizing expected value is the objectively correct choice, always. The phenomenon you're describing, where people prefer less risk, arises because in the real world people don't consider the value of money to scale proportionally with its quantity, i.e. 10 times more money is not 10 times more preferable. This makes sense: $100 means a lot more to someone with $0 to their name than to a millionaire. Because of the marginal value of money, people prefer less risky ventures, but this doesn't go against the principle of maximizing expected value; it simply uses a less naive formulation of value than the raw dollar amount.

You've still yet to prove why switching your choice gives you a higher probability of getting more money. You don't have an expected value. There is no reason to expect your prize is $10,000. You have a literally equal chance. You should expect to get $100 just as much as you should expect to get $10,000. There is absolutely 0 change in probability once you know the amount in your box. You have an equal expectation to maximize your value, as you do to minimize it.

That's what expected value means. This is very basic probability: if I have two options with 50/50 probability, one giving me $100 and one giving $10000, my expected value is .5 * 100 + .5 * 10000 = $5050. This is the average result I can expect from this choice, and it's clearly more than $1000 under a naive formulation. If you're arguing that it is not 50/50, perhaps, but the question is ambiguous in that regard.

>You don't know the distribution
>therefore it's 50/50
Jesus Christ, is this actual ignorance or just weak Veeky Forums trolling?

I do know the distribution. It's $100, $1,000 or $10,000.

I guess trolling was the answer

I'm not trolling.

has this ever actually convinced anyone that you aren't a troll? If you truly aren't trolling please try and learn something about a subject before you talk about it with such confidence.

I don't particularly care if you're convinced.

I would swap due to minimal opportunity cost.

Say I swap and it's only $100.

Then I only really lost $900.

Say I stay and the other box had $10,000. Then I lost $9,000.

What I stand to lose by staying is far more than what I stand to lose by swapping.

There is no distribution such that the probability of B being larger is independent of the amount of money in A. I already demonstrated above that such a distribution can't exist. It may be 50/50 for some N, but not all, so it's stupid to assume it's 50/50 for $1000. The question is unanswerable.

Bump

1000$ is already enough, no need to get greedy.

This

Both $100 and $1000 are peanuts

$10,000 is a down payment on a house

[I am that guy]
No, it does not! Jesus, do you really think you have a bigger EV by switching? We don't have to consider the probability distribution over the underlying set, but over "box with the high amount" vs. "box with the low amount", which stays the same: 1/2. And the EV is still conditional. If you still don't agree, show me a concrete situation.

the only correct answer

For every possible distribution, once you open the box you know whether or not you should switch based on how much money is inside.

Let X be the amount of money (arbitrary units) in whichever box has less. Let's say our distribution is P(X=x) = 2 - 2x when 0 < x < 1, and 0 otherwise.
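Under that triangular density the decision rule can be worked out explicitly (a sketch assuming support (0, 1), so the density integrates to 1; the larger box holds 10X, so its amount needs a change of variables):

```python
def f(x):
    """Density of the smaller amount X: 2 - 2x on (0, 1), else 0."""
    return 2 - 2 * x if 0 < x < 1 else 0.0

def expected_switch_gain(n):
    """Expected gain from switching after seeing amount n in your box.

    Posterior weight of 'I opened the smaller box' is f(n); weight of
    'I opened the larger box' (which holds 10X) is f(n / 10) / 10.
    Switching gains 9n in the first case and loses 0.9n in the second.
    """
    w_small, w_large = f(n), f(n / 10) / 10
    if w_small + w_large == 0:
        raise ValueError("impossible amount under this distribution")
    p_small = w_small / (w_small + w_large)
    return p_small * 9 * n - (1 - p_small) * 0.9 * n
```

For small amounts the expected gain is positive, so you switch; for any amount above 1 you must be holding the larger box, so you keep. That is exactly the "once you open the box you know whether to switch" claim.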

Why would you assume anything about the distribution of monetary rewards? You have no information; it's a flat prior.

>You have no information
yes
>it's a flat prior
does not follow

If you have absolutely no information about what the distribution is, you can't just assume it's uniform. You have no reason to believe that's more likely than literally any other distribution.

>You have no information
>it's a flat prior.
so which one is it?

kys brainlet scum

Because it's the maximally entropic distribution that fits the constraints.

Some people have protested that the OP's question really isn't the two-envelope problem because you've gained additional information by opening envelope A and seeing the amount within. But no one seems willing or able to answer how that helps them make a decision.

Let's simplify.
You're led into a room with 100 pairs of boxes. Within each pair, one contains 10 times as much money as the other but which is which has been randomly determined. If you believe the experimenter, it doesn't make a difference which you open so you might as well always choose the left-hand box.
You open Left 1 and there's $100 inside. Have you really learned anything about Right 1? No. Might as well keep the $100 and move on.
Left 2 contains $100
Left 3 contains $100
Left 4 contains $10
Left 5 contains $10
Left 6 contains $100
You're starting to see the pattern. You still have no proof that all boxes contain either $10 or $100, but from this point onwards it would make sense to switch if the left box (the one you opened) contains $10.

Opening one box tells you nothing about the likely contents of the other box! Is $100 a lot of money or is it a little? So it IS the two-envelope problem. Only when you've gained experience about the probable sums can you start to make decisions on a better-than-chance basis.

Thanks to the posters who helped me wrap my head around this.

If you choose a uniform distribution you need to specify an upper limit (as I've said over and over and over again it can't be all R). I'll call this b and assume the lower limit is 0. Then there's a 50% chance you pick the small box and there's less than b in it, a 5% chance you pick the large box and there's less than b in it, and a 45% chance you pick the large box and there's more than b in it. The cases where there's the most money in the box you opened are the cases where you're most likely to lose, which is why the expected value of switching is 0.
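A simulation of exactly this setup (my own sketch, with b = 1): the overall gain from switching averages out to zero, but conditional on the opened amount being below b switching wins on average, and above b it always loses.

```python
import random

def uniform_pair_simulation(b=1.0, trials=200_000, seed=2):
    """Smaller amount X ~ Uniform(0, b), larger box holds 10X, and you
    open a uniformly chosen box. Returns the mean switching gain
    overall, given the opened amount is below b, and given it is above b."""
    rng = random.Random(seed)
    overall, below, above = [], [], []
    for _ in range(trials):
        x = rng.uniform(0, b)
        if rng.random() < 0.5:
            picked, other = x, 10 * x
        else:
            picked, other = 10 * x, x
        gain = other - picked
        overall.append(gain)
        (below if picked < b else above).append(gain)
    mean = lambda vals: sum(vals) / len(vals)
    return mean(overall), mean(below), mean(above)

m_all, m_below, m_above = uniform_pair_simulation()
```

Roughly 55% of opened amounts land below b (all the small boxes plus a tenth of the large ones), where switching gains on average; the other 45% are certain losses, and the two effects cancel exactly.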

You're assuming a distribution where the one sample you have is anomalously small, which is stupid.

The reason the envelope problem works is that the utility of switching is 0 for every possible distribution, so you don't need to know what the distribution is. This problem is unanswerable because the utility of switching is positive for some distributions and negative for others, and we have no information about which case it is.

Also
>from this point onwards it would make sense to switch if the left box (the one you opened) contains $10
is irreconcilable with
>Opening one box tells you nothing about the likely contents of the other box!

Yes, $100 or $1,000 will very rarely impact your life significantly, but $10,000 will go a long way.
And
Yes, you have a 50% chance to lose $900 and a 50% chance of gaining $9,000, so the way I see it that's a statistical net positive of $8,100.

A computer is capable of accurately predicting any future event. You are presented with two boxes and may take both boxes with you, or take one and leave the other. Box A contains $1,000. Box B will contain $10,000 if the computer predicts you will take only Box B, and $0 if the computer predicts you will take both boxes. What do you do?

>choose a number at random
Okie dokie!
128371928371928371928371298371928371298371928371298371298371923719823719287318927391827389127398127398127392749126519561982471283727391283791283792837918237912837928739174916591561947194719283712983718923719283719371923...

I meant opening one box of the FIRST pair tells you nothing new.
Opening one box of the Nth pair tells you something.
The knowledge doesn't come suddenly with any single box. It grows the more you open.
So the statements are not irreconcilable.

That's another problem entirely (Newcomb's paradox) and depends on whether you believe a computer (or God) can accurately foretell the future.
There are arguments for either course of action and, AFAIK, neither side is ready to concede.
This problem (and others) leads to the conclusion that it is impossible to have something that can absolutely predict the future AND tell you about it. This way lies infinite regress.
The problem begins with a false premise: that such an entity could exist.

Hah, brainlets. I've watched numberphile, I know how to solve this. I find a place with three doors, and put the two boxes behind two of the doors, then I knock myself out.

Is this troll statistics

>You're assuming a distribution where the one sample you have is anomalously small, which is stupid.
This argument doesn't hold water. No matter what number you choose, there's infinitely more probability mass to the right of it on the number line. Every number is anomalously small.

>No matter what number you choose, there's infinitely more probability mass to the right of it on the number line.
NO! THERE'S NO SUCH THING AS A UNIFORM DISTRIBUTION OVER R!

E(B) = 100 * .5 + 10,000 * .5 = 5,050

E(A) = 1,000

E(B) > E(A)

That's a failing of the underlying theory.

Also:
math.stackexchange.com/questions/929702/hyperreal-probability-density

The fuck are you on about? The distribution you're trying to describe is nonsensical.

So what, you chose a big number,... bravo?

You're right with your repeated "no uniform distribution over IR". But I have to repeat myself too: you don't know the distribution. And more importantly, by knowing box A contains 1000, you gain no information about the distribution. So what matters is whether you have taken the box with the high value or the one with the low value: 50/50.

>So what matters is whether you have taken the box with the high value or the one with the low value - 50:50.
Not knowing a distribution does not make it 50/50, it makes it unknown.
>And more importantly, by knowing box A contains 1000 you gain no information about the distribution.
Of course sampling a random variable gives you information about its distribution. Even a single sample of a uniform distribution gives you an informed guess at its upper bound.
en.wikipedia.org/wiki/German_tank_problem
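In the spirit of the German tank problem, even one observation from a bounded uniform pins down the bound on average (a sketch of my own; if N ~ U(0, b) then E[N] = b/2, so doubling a single draw is an unbiased estimate of b):

```python
import random

def double_one_sample_estimates(b=1000.0, trials=100_000, seed=3):
    """If N ~ Uniform(0, b) then E[N] = b / 2, so 2N is an unbiased
    estimate of the unknown bound b from a single sample. Averaging
    many independent single-sample estimates should recover b."""
    rng = random.Random(seed)
    return sum(2 * rng.uniform(0, b) for _ in range(trials)) / trials

est = double_one_sample_estimates()
```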

the other box could have either $10000 or $100 but it's 50/50
10000-1000>|100-1000|