Stats/Probability Thread

Anybody here do statistics or statistical mechanics?

I have a theory question if so. It seems so simple I can't believe the professor put it there, so I'm thinking it's a trick question.

> If the moment generating function of the distribution exists, then the Method of Moments estimator always exists
Why would it ever not exist? It's just an estimator of the moment. You get it from the MGF.

General stats and probability thread!

Cauchy distributions have no MGF, E[X], or Var[X], anon

stats is a chad field, thus Veeky Forums need not apply

Motherfucking Cauchy dists...

That's only because beta statisticians define it that way in order to hide the honest truth, so that they can claim that the law of large numbers always holds.

it would be far more honest of them to acknowledge that the expected value of a probability density function f(x) is the x coordinate of the centre of mass of that function if it were a uniform lamina, i.e. if its area were actually a thin volume of uniform density.
Any honest person should be able to see and acknowledge that the centre of mass of the Cauchy distribution would be at its centre, because it is symmetrical, symmetrical objects of uniform density always balance on their line of symmetry, and the x coordinate of an object's centre of mass is always directly above where it balances.

however probabilitycucks and statscucks don't want to acknowledge this obvious truth, because they want to dishonestly claim that the law of large numbers always holds by discounting all the counterexamples, like the Cauchy distribution.

The expected value of the Cauchy distribution, as with any symmetrical distribution, is obviously at its line of symmetry, in this case 0.
proof: the density at x, f(x), is always equal to the density at -x, f(-x), so for every x, x*f(x) + (-x)*f(-x) = 0, so the contributions of all these parts cancel out and the total is always 0.

the expectation of the Cauchy distribution IS EQUAL to 0, the so-called "law" of large numbers simply doesn't always work and isn't a real law at all.
This is what dishonest probability cucks want to hide from you.

No statistician claims the law of large numbers always holds, but it can be shown to hold whenever the distribution is integrable, i.e. E|X| < ∞, which the Cauchy isn't.
And if you think your definition of expected value is superior, go ahead, define it that way and see if anything interesting follows from that.

Personally I find honesty interesting and its own reward.

the law of large numbers can suck my dick because it is obvious that the Cauchy distribution does have a mean of 0, the probcuck and statcuck establishment are just pretending it doesn't because they're a cabal of liars and charlatans polluting the public space and trying to indoctrinate the public with their lying dogma.

ok bud, whatever you say
looking forward to your groundbreaking publications

>tfw stochastic processes right now
>tfw next course is probability II with the book An Intermediate Course in Probability by Allan Gut

>it is obvious that the cauchy distribution does have a mean of 0
Just simulate it bro

the """"law"""" of large numbers is not true BECAUSE it doesn't work for the Cauchy distribution and some others.

you sound like Terry Davis, anon
the laws of large numbers, weak or strong, don't make any statement with complete certainty, which implies counterexamples are sure to be found

this guy gets it. Terry Davis above should simulate it, i.e. write a random number generator using a Cauchy distribution and average the results. Let me know how your experiment ends lmao
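For anyone who actually wants to run that experiment, here's a minimal sketch in Python (numpy assumed; the sample size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)

# Draw standard Cauchy samples and track the running sample mean.
n = 100_000
samples = rng.standard_cauchy(n)
running_means = np.cumsum(samples) / np.arange(1, n + 1)

# For an integrable distribution the running mean would settle down;
# for the Cauchy it keeps jumping around no matter how large n gets.
print(running_means[[99, 9_999, 99_999]])
```

The punchline: the average of n i.i.d. standard Cauchy draws is itself standard Cauchy, so the running mean never narrows down, and rerunning with a different seed gives a completely different trajectory.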

JUST
U
S
T

>tfw statistchad and probabilachad at the same time

Okay so I'm about to have my first stats midterm. Please help me clear up any misunderstandings I have.

Hypergeometric: for a population of N items containing r of interest, drawing a sample of n without replacement, the pmf is
[math]\frac{\binom{r}{x}\binom{N-r}{n-x}}{\binom{N}{n}}[/math]
This is for the pmf. The cdf is just the sum of the pmf over all values up to x.
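A quick sanity check of that pmf with Python's built-in math.comb (the numbers here are made up for illustration):

```python
from math import comb

def hypergeom_pmf(x, N, r, n):
    """P(X = x): x successes in a sample of n drawn without
    replacement from N items, of which r are successes."""
    return comb(r, x) * comb(N - r, n - x) / comb(N, n)

N, r, n = 20, 7, 5   # population, successes, sample size (made up)
pmf = [hypergeom_pmf(x, N, r, n) for x in range(n + 1)]
print(sum(pmf))      # should be 1.0 up to rounding
```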

The Bernoulli is just the binomial with n=1.

The binomial is choose x successes out of n trials, multiplied by the probabilities raised to their respective powers: [math]\binom{n}{x}p^x (1-p)^{n-x}[/math]

The mean is the first derivative of the MGF evaluated at 0; the second derivative at 0 gives E[X²]. The expected value is linear. The variance is E[X²] minus the square of the expected value. A fair game has expected value 0.
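Those MGF facts are easy to check numerically with finite differences; a sketch using the Binomial(n, p) MGF as the test case (parameters made up):

```python
import math

n, p = 10, 0.3   # made-up Binomial(n, p) parameters

def binom_mgf(t):
    """MGF of Binomial(n, p): M(t) = (1 - p + p*e^t)^n."""
    return (1 - p + p * math.exp(t)) ** n

h = 1e-5
# mean = M'(0), via a central difference
mean = (binom_mgf(h) - binom_mgf(-h)) / (2 * h)
# E[X^2] = M''(0), via a second central difference
second_moment = (binom_mgf(h) - 2 * binom_mgf(0) + binom_mgf(-h)) / h**2
variance = second_moment - mean ** 2

print(mean)      # ~ n*p = 3.0
print(variance)  # ~ n*p*(1 - p) = 2.1
```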

Not sure on that mgf statement

What is the expected value of a bimodal distribution with a heavy tail then?
checkmate

>The Bernoulli is just the binomial with n=1.
The binomial is a sum of n independent Bernoulli trials
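That identity can be verified by convolving the Bernoulli pmf with itself n times (a pure-Python sketch; p and n are made up):

```python
from math import comb

def convolve(a, b):
    """Pmf of the sum of two independent integer-valued variables."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, pa in enumerate(a):
        for j, pb in enumerate(b):
            out[i + j] += pa * pb
    return out

p, n = 0.3, 4
bernoulli = [1 - p, p]   # pmf of a single trial

pmf = [1.0]              # pmf of the empty sum
for _ in range(n):
    pmf = convolve(pmf, bernoulli)

# Compare with the closed-form binomial pmf.
binomial = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
print(all(abs(a - b) < 1e-12 for a, b in zip(pmf, binomial)))  # True
```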

you don't understand. The law of large numbers says that if you sample a random variable a large number of times, add all those results together, and divide by the number of samples, then the resulting number will tend to the expected value of the probability distribution of that random variable.

This is not true for the Cauchy distribution; it is a counterexample.
So yes, obviously you can't simulate the expected value of the Cauchy distribution that way, because the Cauchy distribution is a counterexample to the law of large numbers.

This does not change the fact that the Cauchy distribution obviously does have a mean, because it is symmetrical and so the expected value is at its line of symmetry.
Every symmetrical distribution has its centre of mass, its expected value, at its line of symmetry.

Like I said before, if it's symmetrical then its expected value is at that line of symmetry. if not symmetrical then I don't care.

I just started a prob & stats course. The recommended book is the one by Ross, but I had a gander at it and I hated the layout, plus I noticed quite a few errors.
I decided to pick up DeGroot just the other day and it's a lot better, I appreciate all the detailed explanations. Are there any equal or better books I should check out?

is stats a good major? what kind of job would i have?

Redpill me on Bayes' theorem.

Best decision I’ve ever made, but you’ll need a masters to get a really good job. I work 9-5 for a southern bank (US), developing credit and fraud models (logistic regression, neural nets, gradient boosted trees, etc.) and make 130k/year.

Take a few CS and ML courses if you can. Once you get out in industry, work somewhere for a couple of years and then hop a few times to your desired location. Took me 3 jobs over 6 years to get from 70k-100k-130k. Got friends at my first job who have been there for 6 years now and are still just making 90k.