You are given 2 boxes A and B. Each box has some unknown amount of money...

Interestingly, if you go a step back (before opening box A), you can do better than a 50% winning chance:

Choose a number at random, [math] X [/math]. Then pick a box: keep it if [math] X [/math] is smaller than the amount in the box, and switch otherwise.
There are 3 cases:
I) [math] X [/math] is smaller than both [math] A [/math] and [math] B [/math], with probability [math] P = x_1 [/math]
II) [math] X [/math] lies between [math] A [/math] and [math] B [/math], with [math] P = x_2 [/math]
III) [math] X [/math] is bigger than both [math] A [/math] and [math] B [/math], with [math] P = x_3 [/math]

In cases I) and III) you have a 50% chance of ending up with the better box, while in case II) you always do (you keep the bigger amount and switch away from the smaller one), so your "win" probability is
[math] \frac{x_1 + x_3}{2} + x_2 [/math]
and since [math] x_1 + x_2 + x_3 = 1 [/math], this equals
[math] \frac{1}{2} + \frac{x_2}{2} [/math]
Because [math] x_2 > 0 [/math] whenever [math] X [/math] is drawn from a distribution with positive density on all possible amounts, you get a better than 50% probability; a quick simulation of this is sketched below.
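A minimal simulation sketch of this strategy, assuming (purely for illustration) that the boxes hold $1000 and $10000 and that the threshold [math] X [/math] is drawn from an exponential distribution; any distribution with positive density on all positive amounts works:

[code]
import random

def play_once(a=1000, b=10000):
    # Pick one of the two boxes uniformly at random.
    mine, other = (a, b) if random.random() < 0.5 else (b, a)
    # Draw the random threshold X (exponential is an arbitrary choice;
    # it only needs positive density on every positive amount).
    x = random.expovariate(1 / 5000)
    # Keep the box if its content beats X, otherwise switch.
    final = mine if mine > x else other
    return final == max(a, b)

trials = 200_000
wins = sum(play_once() for _ in range(trials))
print(f"win rate: {wins / trials:.3f}")  # comes out well above 0.5
[/code]

With these particular numbers the win rate lands around 0.84; with less convenient amounts the edge over 50% can be tiny, but it is always strictly positive.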

You accept a bet if the possible gain times probability of winning exceeds possible loss times probability of losing.
If you take box B instead, there's a 50% chance of gaining $9000 vs. a 50% chance of losing $900. By that rule, you should switch.
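Written out, that naive expected-value computation (treating the two outcomes for box B as equally likely, as assumed above) is
[math] \tfrac{1}{2}(+9000) + \tfrac{1}{2}(-900) = +4050 > 0 [/math]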

Something is wrong with the logic here, since the same argument would apply if you'd chosen box B first.
After all, if one person chose A and another person chose B, they can't BOTH have a winning strategy by switching.

It's a good problem and I hope someone can explain the correct solution clearly.

It's a well-known problem you'll find all over the web:
en.wikipedia.org/wiki/Two_envelopes_problem
brilliant.org/wiki/two-envelope-paradox/
mathoverflow.net/questions/9037/how-is-it-that-you-can-guess-if-one-of-a-pair-of-random-numbers-is-larger-with

Yes. Risk-reward says switching is always profitable.

>game theory goyim believe that loss and gain have equal weight

I posted but recognized something was wrong with my logic.
So I read https://en.wikipedia.org/wiki/Two_envelopes_problem
particularly the "simple solution" section.
There, the envelopes contain 1 and 2 dollars.
If you have 1 and switch, you gain 1
If you have 2 and switch, you lose 1
Since the expected gains and losses are equal, there's no point in switching. That's clear.
But here we have a TEN-to-one ratio. If your envelope contains 10, then you could gain 90 or lose 9 by switching. So switching seems advantageous again.
So your links really aren't clearing this up for me.

Being in this situation, I know for sure no one would have put 10k as a prize for a random stranger like me. I pick my 1k and get the fuck outta here

>$5050 in average
Literally lies, damned lies and statistics.

>But here we have a TEN to one ratio.
The ratio makes no difference:

>If you have 1 and switch, you gain 1
>If you have 2 and switch, you lose 1
If you have 1000 and switch, you gain 9000
If you have 10000 and switch, you lose 9000
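
A quick simulation sketch of exactly that setup (the $1000/$10000 figures are the ones quoted above; the rest is just illustration), showing that always switching and never switching pay the same on average:

[code]
import random

def average_payout(always_switch, trials=200_000, a=1000, b=10000):
    # The pair (a, b) is fixed in advance; the only randomness is
    # which of the two boxes you happened to grab first.
    total = 0
    for _ in range(trials):
        mine, other = (a, b) if random.random() < 0.5 else (b, a)
        total += other if always_switch else mine
    return total / trials

print("never switch :", average_payout(False))  # ~5500
print("always switch:", average_payout(True))   # ~5500, same on average
[/code]

Both come out around $5500 = ($1000 + $10000)/2, so the 10-to-1 ratio buys the switcher nothing.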

I see what you're saying, but it still seems asymmetric.
Your numbers are right, but you don't know whether you have the larger or smaller amount. You just know you have X, with the possibility of gaining 9X or losing 0.9X.
I'm afraid I still don't understand the flaw -- though your example helps.