Problem time?

Here is a simple problem. It's half maths and half strategy; I can't fully get my head around it, and I figure there is an optimal solution that will beat the others.

>You are given 10,000 dollars to gamble.
>The odds of what you must bet on fluctuate between a minimum of $2 and a maximum of $10 and will average $2.50.
>You do not know the exact odds before you bet.
>The win rate for what you are betting on is 45%.
>You can only play 2000 games.

Unlike normal gambling, the odds here are in your favor; you just have to develop a strategy to maximize your return. How would you go about it?

I'd like to see what Veeky Forums thinks here, or whether there exists a program to work something like this out. I've been feeling fairly stupid, as my strategy would simply be doubling down, with my initial bet recalculated after each win to be able to cover a loss streak of 11.
So:
(initial bet)*2^0 + (initial bet)*2^1 + (initial bet)*2^2 + ... + (initial bet)*2^11
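(For reference, assuming only the starting $10,000 is available to cover the streak: that sum is (2^12 - 1) = 4095 times the initial bet, so covering an 11-loss streak means an initial bet of roughly 10000/4095, or about $2.44.)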

Fuck that I'm buying a carton of cigs, lottery tickets, a couple tall boys and some porno mags. I'll put the rest under my dresser.

>(initial bet)*2^0 + (initial bet)*2^1 + ... + (initial bet)*2^11
(initial bet)*2^0 + (initial bet)*2^1 + ... + (initial bet)*2^7

Can you clarify the part about the odds? Why do the odds have a dollar amount? Is that a multiplier on the bet if you win? If so, I doubt any house will give a 10x payout on a coinflip

> Is that a multiplier on the bet if you win?
Yes it is; it's the payout.
If you put down $1 and the odds are $2.50, you get a $2.50 payout and profit $1.50.

>If so, I doubt any house will give a 10x payout on a coinflip
It is a hypothetical scenario, and more interesting because classic gambling strategies are pretty flawed, whereas this is actually about making money as efficiently as possible while reducing risk.
At an average of 2.5, a $10 payout would be very unlikely, but additionally its potential would favour more risk-taking strategies, no?

Look up the Kelly criterion and modify it slightly for your problem.
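For anyone following along, the standard Kelly fraction for a repeated bet is f* = (b*p - q)/b, where p is the win probability, q = 1 - p, and b is the net odds (profit per $1 staked). If a $2.50 payout means $1.50 profit per $1 staked (b = 1.5, p = 0.45, q = 0.55), that gives f* = (1.5*0.45 - 0.55)/1.5 = 0.125/1.5 = 1/12, about 8.3% of the bankroll per bet. This treats the 2.5x average payout as if it were fixed odds, which is a simplification since the actual odds fluctuate between 2x and 10x.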

Is there a pattern to the payout of one game and the next? If one game pays 10/1, is the next likely to also be high? If so, you want to make smaller bets until you recognize high payouts, then start betting high until the payouts drop.

The problem with this setup is that most games are in your favor. If there's no pattern to the payouts and your goal is to maximize your expected profits, the best strategy is to bet almost all your money on every game. This gives you a 5E-3001 chance of ending with any money, but if you do, it's an unimaginable amount.

This is dumb. I care about my first million more than my second, and my second more than my third. So instead of maximizing my expected dollars, I'll try maximizing the expected square root of my dollars. This will prioritize high-probability medium payouts over low-probability large payouts. The new problem is that with a non-linear utility function you can't properly calculate your expected gain without knowing the distribution of the odds, so you need to piece that together as you go.
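For anyone who wants to experiment, here is a minimal Monte Carlo sketch along those lines (not anything posted in the thread). It assumes a fixed 45% win rate, that a win returns 2.5x the stake (i.e. 1.5x profit), and that the fluctuating odds can be replaced by their 2.5 average; the function names and trial counts are just placeholders.

import random
import math

def simulate(fraction, games=2000, bankroll=10000.0, p_win=0.45, net_odds=1.5):
    # Stake `fraction` of the current bankroll on each of `games` bets.
    for _ in range(games):
        stake = bankroll * fraction
        if random.random() < p_win:
            bankroll += stake * net_odds   # win: 1.5x profit per dollar staked
        else:
            bankroll -= stake              # loss: the stake is gone
    return bankroll

def compare(fraction, trials=1000):
    # Monte Carlo estimates of E[final $] and E[sqrt(final $)].
    finals = [simulate(fraction) for _ in range(trials)]
    return sum(finals) / trials, sum(math.sqrt(w) for w in finals) / trials

for f in (0.05, 1/12, 0.15, 0.28, 0.50):
    ev, eu = compare(f)
    print(f"fraction {f:.3f}:  E[$] ~ {ev:.3g}   E[sqrt($)] ~ {eu:.3g}")

Under the square-root utility the optimum should land somewhere between the Kelly fraction and betting everything in this simplified model, since sqrt is less concave than log but more concave than a straight dollar count.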

>I'll try maximizing the expected square root of my dollars.

I say you should maximise the expected value with the constraint that the probability of you making a loss is less than x%,
or with the constraint that the probability of you making less than, say, 100G is less than x%,

where you decide the x%.

I wouldn't play at all, and would invest in crypto instead. After the pump I would dump before the whales and buy a Ferrari. The only way to win is to cheat, user. Get with the times.

Extrapolate exact odds from win rate.
Profit.

OP here
I'm looking this up and trying to get my head around it. Thanks for pointing me in that direction; it does seem relevant.

Assume there is no pattern at all to the payout, just that it is highly random.
It seems like you'd almost use it as an opportunity to play the lottery? Perhaps in the problem I should have included more incentive not to lose, like you would die or you would have to pay the difference...

If some strategies were run side by side we could see the varying payouts vs risk and determine the objectively best outcome... which could be yours...
You also must assume that the chance of a win is a constant 45%.

I like the way you are approaching this.
If after 2000 games you wanted the probability of walking away with less than what you started with to be under 1% how would you play?

10,000 bitcoin to gamble then ;) With odds like this it is cheating.

>The win rate for what you are betting on is 45%

This line alone invalidates all strategies. Just bet on $10 every time (for $1) and you will make (($10 x 2,000 games) x 0.45 win rate) - ($1 x 2,000 games) = $7,000 profit.

>Just bet on $10 every time (for $1) and you will make (($10 x 2,000 games) x 0.45 win rate) - ($1 x 2,000 games) = $7,000 profit.

This is wrong though.
Purely doubling down from an initial bet of approximately $20 will leave you with an extremely low probability of going bust and will generally return above $35,000. I'm purely modeling from self-created data sets and have not made a formula for this, approaching it in a trial-and-error fashion.

I think, additionally, the maths you are using is definitely off.
Wouldn't it be something more like
((10*2000)*0.45)*2.5 - (10*2000), which gives $2,500?
I don't actually know where or how you derived your formula... $7,000 matches up with an overall win total but does not take the losses into account...

Why not bet $100 though and get 10x the return? Why $10?

The Kelly criterion, betting a proportion of the bank each time, may be the best option. However, I have not yet been able to decide how well it stacks up when future odds are variable.

If you allowed the risk of going completely bust over all the games to be 1%, there could be some very good return strategies.

>This line alone invalidates all strategies.
It is essentially a 2.5x return on a win. If the probability were above 50%, how would that change your approach?

Just bet 1/12 of your money each time. Expected value is $10,023,109,001,700.68
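That figure checks out under the same reading as above (1.5x profit per dollar staked). If you stake a fraction f each game, the expected multiplier per game is 0.45*(1 + 1.5f) + 0.55*(1 - f) = 1 + 0.125f. With f = 1/12 that is about 1.0104 per game, and 10000*(1 + 0.125/12)^2000 is roughly 1.0023 x 10^13, which agrees with the quoted number to about four significant figures. Note the same formula says expected value keeps rising as f rises, so strictly maximizing E[$] still points at betting everything; 1/12 is what the Kelly (log-growth) criterion gives.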

Please explain this problem better. So I can bet any amount on this board and have a 45% chance of winning a payout that over time will average 2.5x?

So hypothetically I could give you $x, and 55 out of 100 times you would return nothing, and 45 out of 100 times you would return a payout, and most likely after a billion games of play I'd see you were returning 2.5x for the 45% of games I was winning?

>So hypothetically I could give you $x, and 55 out of 100 times you would return nothing, and 45 out of 100 times you would return a payout, and most likely after a billion games of play I'd see you were returning 2.5x for the 45% of games I was winning?

Yes, that is correct.
Additionally, the return on a win would never be below 2x.

There is the constraint of only having $10,000 to start with and a limit of 2000 games.

There is no bet limit.

There is a (0.55*0.55) = 30.25% chance I lose 2 games in a row, or a 69.75% chance I don't lose 2 games in a row.

Of course you can bet based on luck, but betting 50% of my current pot should net the most profit since any win covers any loss. I'll write a program in VBA tomorrow that demonstrates this.
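A quick sanity check on the 50% stake: what matters for the typical outcome over many games is the expected log of the per-game multiplier, not whether one win covers one loss. If a win returns 2.5x the stake (1.5x profit), that is 0.45*ln(1.75) + 0.55*ln(0.5), roughly 0.252 - 0.381 = -0.13 per game; even if a win pays 2.5x profit on top of the stake it is 0.45*ln(2.25) + 0.55*ln(0.5), roughly 0.365 - 0.381 = -0.02. Either way it is negative, so staking half the pot every game typically shrinks the wallet even though the game is +EV.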

Surely the most plus-EV line is to max bet on every bet. I understand you are trying to think about future games to preserve your bankroll, but your future games are also bigger after the bigger win in the first round, and then the same logic applies to the second bet, etc.

Always bet 45% of your money. IDK why, but it just feels right.

Just messing around with some simulations of betting strategies involving betting the same amount every time.

If I were to try adaptive strategies, I'd probably first experiment more to build a good estimate of the distribution of which betting amounts tend to lead to death from which starting point; then I'd start high and, if I got below some threshold, look up the optimal amount based on that.

Mathematica file if anyone's interested:
my.mixtape.moe/lpcvzt.nb

I GIVE UP FOR THE NIGHT.

Either my program is wrong or there is basically no advantageous strategy here.
As long as you bet a percentage of your current wallet size (anywhere from 1% to 99%), you will always average a decent amount of money.

Prove me wrong and I'll tip you $5 LTC.

Hahahaha,

I'm so dumb. Fixed my stuff though. See attached for the solution.

>You do not know the exact odds before you bet.
en.wikipedia.org/wiki/Multi-armed_bandit
en.wikipedia.org/wiki/Thompson_sampling
stats.stackexchange.com/questions/23382/best-bandit-algorithm

Wow, thanks user. Really interesting stuff. I'm surprised to see how much of a standout 28% is, if I am reading that right.
Unreal stuff.
Are you using MATLAB or something?

Cool gif for the simple bets.

It's VBA in Excel. Type in the code and anyone can play with it. If I have time today I'll play around a bit more and maybe run a much larger sample size.

Yes, 28% seems to be the key betting size using this method, and frankly I'm not smart enough to think of a better strategy.

So to reiterate what the graph shows: I ran 200 games of 2000 "dice rolls" (or whatever it is I'm betting on) for 1%, 2%, 3%, ... , up to 99% of whatever my current wallet size is (starting at 10k).

The key factor, I believe, is that you on average win 2.5x, so I just assumed that outright. If you define the actual probabilities for each winning payout I could get you a more accurate chart.
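This isn't the VBA, but for anyone who wants to poke at it, here is a rough Python sketch of the same experiment: sweep the staking fraction from 1% to 99%, run 200 trials of 2000 bets for each, and report the mean and the median final wallet (the two can disagree wildly with payoffs this skewed). It assumes a flat 2.5x payout and a 45% win rate; the profit-per-dollar parameter is there only because the thread is split on whether a win returns 1.5x or 2.5x profit on the stake.

import random
import statistics

def run_trial(fraction, games=2000, bankroll=10000.0, p_win=0.45, profit_mult=1.5):
    # profit_mult = profit per dollar staked on a win; set it to 2.5 for the
    # "2.5x profit on top of the stake" reading used elsewhere in the thread.
    for _ in range(games):
        stake = bankroll * fraction
        if random.random() < p_win:
            bankroll += stake * profit_mult
        else:
            bankroll -= stake
    return bankroll

for pct in range(1, 100):
    finals = [run_trial(pct / 100) for _ in range(200)]   # 200 trials per fraction, like the VBA run
    print(f"{pct:2d}%  mean ~ {statistics.mean(finals):.3g}   median ~ {statistics.median(finals):.3g}")

With only 200 trials per fraction the mean is dominated by a few huge outliers, so where it peaks will wander from run to run; the median is much more stable and should peak near the Kelly fraction.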

I know I keep spamming this thread with data when I should just collect my thoughts into one post, but before I head to work I figured I'd check if the data would fit a logarithmic graph, and what do you know, it did.

OK, but now I really have to go to work.

Seems like low bets are more stable than high bets, which makes sense. The real question is how we determine the Goldilocks bet value for any win % and average odds.
Obviously, the higher the win %, the higher the bet value should be. You are better off betting high on a 99% chance than on a 1% chance.
The number of games directly affects the maximum earnings but not your choices.
The only question is how the average odds affect how much you should bet.
Does changing the odds change the ideal bet, or only the maximum gains?

Finally, we would want an equation that can tell us how much to bet for any given input values.

If 2.5 WILL be the average at the end of 2000 games, I'd say look at the average of the previous payouts: if it's higher than 2.5, bet low; if it's lower, bet high. If it's exactly 2.5, bet 5.
Every bet would be proportioned to both the average so far and the playable games remaining, so you get a more accurate probability.

Sauce? I searched everywhere and can't find it.

Just realized that supposes that you only use the starting 10k bucks, without regambling your earnings

Is this correct?

This is the gambler's fallacy. Past odds don't change future odds unless OP explicitly states so (which he doesn't).

If 2.5 will for sure be the average, then past odds do change future odds.

There is a difference between odds and chance. Or chance and likelihood so to speak. The difference is in scale and perspective. The "chance" of flipping a coin 1000 times and getting heads every time is not 50%. The chance of flipping a coin one time and getting heads is. Scale and perspective.

Wait, is this saying that betting 28% of your current money is the best strategy? Because for that strategy I'm getting that you bottom out virtually 100% of the time.

What distribution are you using to decide the outcome of the current bet? Based on the OP, the most you can assume is that you get a win 45% of the time, and that a win doubles your money 93.75% of the time and decuples it 6.25% of the time.
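For anyone wondering where 93.75%/6.25% comes from: that is the most extreme distribution consistent with the rules, i.e. the payout is only ever 2x or 10x. If p is the probability of the 2x payout, then 2p + 10(1 - p) = 2.5 gives p = 0.9375, so 2x happens 93.75% of the time and 10x the remaining 6.25%.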

That's like saying if you're betting on a fair coin flip you should count how many heads you've flipped previously and compare it to how many tails you've flipped previously and go with the one that has less.

Stop being retarded.

It assumes
(1) 28% of current money is bet
(2) there is a 45% chance of winning
(3) a win nets 2.5x profit

Because OP didn't specify the distribution from 2x to 10x and just said 2.5x on average, that's the best I could do. Tonight I'll run simulations with win percentages other than 45% and see what happens.

Thoughts? I posted my code, so please point out the flaws in the code and I'll tip you some LTC.

Even with that model I'm getting the probability of not dying as less than 1%. Removing the minimum currency denomination of $0.01 (that is, assuming you can split a dollar indefinitely) only raises the probability of coming out with more than a cent to 1.3%.

Couldn't say what's wrong with your code.

Well, that was my reasoning. OP stated in the second rule that the odds WILL AVERAGE 2.5, meaning (as I interpreted it) that if you take any possible game and calculate the average of all 2000 multipliers, it will average exactly 2.5.
I guess it comes down to semantics; I might have misunderstood.

Understandable. OP probably doesn't know enough about probability to know that he should specify iid draws from some outcome distribution, but it's a pretty safe thing to assume.

A fair coin on average will land heads 50% of the time. That doesn't mean exactly 50% of 2000 flips will be heads.

I made a brokeboi Python program a while back to find the optimal staking % using a fixed-% staking strategy, given odds and probability. It essentially calculates the Kelly optimum using simulations rather than maths, so look into that if you want to know more about the maths involved. This was done assuming the odds distribution doesn't have a significant effect on the outcome.

Martingale definitely isn't the strategy you want to use here, since it has a non-zero chance of busting, which becomes inevitable over an infinite time frame.

This would be true for a single wager, but busting would preclude you from taking part in future profitable opportunities, making it the worst strategy here, ironically.

>I made a brokeboi Python program a while back to find the optimal staking % using a fixed-% staking strategy, given odds and probability. It essentially calculates the Kelly optimum using simulations rather than maths, so look into that if you want to know more about the maths involved. This was done assuming the odds distribution doesn't have a significant effect on the outcome.

Sounds pretty interesting actually... Do you have a link to this?

Let's do a thought experiment involving only 2 games played. I'm assuming the rules we imagine for this game are different.

So 2 games - outcomes are 00, 01, 10, and 11.

Starting at $10k the odds of (00) are (0.55*0.55)=30.25%
The odds of (01) are (0.55*0.45) = 24.75%
The odds of (10) are (0.45*0.55) = 24.75%
The odds of (11) are (0.45*0.45) = 20.25%

And the payouts are, respectively, $5,184, $12,240, $12,240, and $28,900. So nearly 70% of the time you end up with more than you started with. If you extend this to any number of games, I'm having trouble seeing how you tend towards zero.

pastebin.com/UuKuL6X1
Here's the code.
It's very simple, but it may take a long time to run if you want good results.

You're assuming that when you win you get your bet back IN ADDITION TO 2.5x the amount that you bet. I don't think that's right. Under the other interpretation the possibilities are $5,184, $10,224 (twice), and $20,164 with those same probabilities, making the average outcome $10,712.30. My code is in agreement with this result, and yet it still says you'd die in the long run.

But I tried it your way in pic related, and as you can see the results after two bets are as you've said, and yet you STILL die in the long run with very high probability, so something about the way you're generalizing your argument to longer time scales is wrong.
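The missing step is the difference between the mean and the typical outcome. With a 28% stake and the 1.5x-profit reading, each game multiplies the wallet by 1.42 with probability 0.45 or by 0.72 with probability 0.55. The expected multiplier is 0.45*1.42 + 0.55*0.72 = 1.035 > 1, so the average over many imaginary runs grows without bound, exactly as the two-game table shows. But the typical run is governed by the expected log multiplier, 0.45*ln(1.42) + 0.55*ln(0.72), roughly 0.158 - 0.181 = -0.02, which is negative: over 2000 games the median wallet is on the order of 10000*e^(-46), effectively nothing. The mean is propped up by an ever-rarer set of runs that win far more often than 45% of the time.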

I wish I could read Matlab code. Maybe the VBA Rnd() function is wrong. I'll think about it some more and post in 9 hours.

It's Mathematica, you dolt.

If you have positive expected value, just bet the maximum possible amount ($10,000?).

This strategy will give you the maximum possible return.

Not necessarily; when you lose, you lose all of your accumulated bankroll.

You want to bet an amount that doesn't risk the bankroll. That is why it is going to be some percentage of accumulated earnings.

The interesting thing would be developing some kind of formula for this, I suppose. The Kelly criterion says 1/12, but this thread has been saying 28%.
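The gap between 1/12 and 28% mostly comes down to how the 2.5 is read. If a $2.50 payout includes the stake (net odds b = 1.5), Kelly gives (1.5*0.45 - 0.55)/1.5 = 1/12, about 8.3%. If a win instead pays 2.5x profit on top of the stake (b = 2.5), Kelly gives (2.5*0.45 - 0.55)/2.5 = 0.575/2.5 = 23%, which is in the neighbourhood of the simulated 28%; the rest of the gap is plausibly just noise from averaging 200 heavily skewed runs per fraction. Either way, the formula for anyone who wants one is f* = (b*p - q)/b, with the caveat that the fluctuating odds and the 2000-game cap aren't accounted for.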