How is Bayesian Statistics related to Machine Learning?

What applications are there for Bayesian Statistics in ML? Does it apply to Deep Learning?

Other URLs found in this thread:

people.eecs.berkeley.edu/~avivt/BRLS_journal.pdf
jmlr.org/papers/volume12/ross11a/ross11a.pdf
cs.cmu.edu/~sross1/publications/Ross-NIPS07-BAPOMDP.pdf
en.wikipedia.org/wiki/Markov_property

Many, e.g. shitpost detection
Yes, keeping track of probabilities is very useful but complicates things; it would be annoying as fuck to implement while also keeping track of state

> it would be annoying to implement while keeping track of state

What do you mean by that?

you have to keep track of the probability of being in each possible state, i.e. maintain a whole distribution over states instead of a single known state
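here's a minimal numpy sketch of that bookkeeping (all numbers made up), just the observation update:

import numpy as np

# instead of one known state you carry a belief: P(state) for every state
belief = np.array([0.5, 0.3, 0.2])        # made-up prior over 3 hidden states
likelihood = np.array([0.1, 0.7, 0.2])    # P(obs | state) for the obs just seen

# Bayes' rule: posterior ∝ likelihood * prior, then renormalize
belief = likelihood * belief
belief /= belief.sum()
print(belief)                             # -> [0.167, 0.7, 0.133]

every state, every step, forever. That's the annoying part.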

real Bayesian deep learning (a full posterior over every weight) would be fucking difficult to implement with current algorithms and hardware. You approximate it with estimators.
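to make "approximate it with estimators" concrete: one popular cheap stand-in is Monte Carlo dropout, where you leave dropout on at test time and read the spread of the sampled predictions as approximate posterior uncertainty. Toy numpy sketch (weights and sizes are made up; a real net would be trained):

import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(10, 32)) * 0.3      # stand-ins for trained weights
W2 = rng.normal(size=(32, 1)) * 0.3

def forward(x, p_drop=0.5):
    h = np.maximum(x @ W1, 0.0)           # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop   # dropout left ON at test time
    return (h * mask / (1.0 - p_drop)) @ W2

x = rng.normal(size=(1, 10))
samples = np.array([forward(x)[0, 0] for _ in range(200)])
print(samples.mean(), samples.std())      # predictive mean and uncertainty

the std here is the "estimator" of posterior uncertainty; exact Bayesian DL would integrate over a posterior on W1/W2 instead.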

Look into POMDPs - pretty much the 'bayesian' approach to deep learning

Naive Bayes for general ML is pretty well known
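for anyone who hasn't seen it, a bare-bones Gaussian Naive Bayes on made-up data (sklearn has a polished version; this is just the idea):

import numpy as np

X = np.array([[1.0, 2.0], [1.2, 1.8], [4.0, 5.0], [4.2, 5.1]])  # toy features
y = np.array([0, 0, 1, 1])                                      # toy labels

def fit(X, y):
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        # per-class feature means, variances, and class prior P(c)
        params[c] = (Xc.mean(0), Xc.var(0) + 1e-9, len(Xc) / len(X))
    return params

def predict(params, x):
    def log_post(c):
        mu, var, prior = params[c]
        # log P(c) + sum_i log N(x_i | mu_i, var_i): the "naive" independence
        return np.log(prior) - 0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    return max(params, key=log_post)

p = fit(X, y)
print(predict(p, np.array([1.1, 1.9])))   # -> 0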

so for example you use a normal distribution to represent your uncertainty about a quantity, and with Bayesian inference each new observation shrinks the standard deviation, making you more and more certain of your beliefs
there are other tools, Bayesian inference isn't the only one used, but it has the advantage that updates like this are very fast on computers
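worked version of that, for a normal prior with known observation noise (numbers are arbitrary); the conjugate update is closed-form, which is why it's so fast:

import numpy as np

mu0, tau0 = 0.0, 10.0                          # vague prior: N(0, 10^2)
sigma = 2.0                                    # known observation noise
data = np.array([1.9, 2.3, 2.1, 1.8, 2.0])     # made-up observations

prec = 1 / tau0**2 + len(data) / sigma**2      # precisions just add
mu_n = (mu0 / tau0**2 + data.sum() / sigma**2) / prec
tau_n = np.sqrt(1 / prec)
print(mu_n, tau_n)                             # mean ~2.0, std shrinks 10 -> ~0.89

each extra observation adds to the precision, so the standard deviation can only shrink.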

Bayes and Fisher would btfo you

>Look into POMDPs - pretty much the 'bayesian' approach to deep learning

Dude, what? No... not at all

Bayesian deep learning = deep learning with a prior
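simplest place the prior shows up: a Gaussian prior on the weights turns the MAP objective into ordinary L2 weight decay. Sketch (names and numbers are made up):

import numpy as np

def map_loss(nll, w, lam=1e-2):
    # MAP: minimize -log P(data | w) - log P(w).
    # With P(w) = N(0, I/lam), the -log P(w) term is 0.5*lam*||w||^2 + const,
    # i.e. plain weight decay.
    return nll + 0.5 * lam * np.sum(w ** 2)

w = np.array([0.5, -1.2, 3.0])   # pretend network weights
print(map_loss(nll=1.7, w=w))

full Bayesian DL keeps the whole posterior over w instead of this single point estimate, which is exactly what makes it hard.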

>Look into POMDPs - pretty much the 'bayesian' approach to deep learning
>Naive Bayes for general ML is pretty well known

good way to put it

i fucking hate you ML brainlets, especially Bayesian ones

probably couldn't even write Bayes' theorem on a chalkboard

>Dude, what? No... not at all
Yes... yes, at all.
Bayesian RL can be (and usually is) modeled as a POMDP.
Or we can just ignore the fact that the belief estimator in a POMDP maintains a probability distribution using Bayes' rule.

>chimes into thread without any constructive input
>implies someone is a brainlet
kek
A model that explicitly uses a posterior probability distribution isn't Bayesian?
Naive Bayes is a pretty common use of Bayesian stats in ML applications, dunno what you're getting at

How would you answer OP's question, user?

also, some papers you may be interested in:

this one is pretty comprehensive
people.eecs.berkeley.edu/~avivt/BRLS_journal.pdf

jmlr.org/papers/volume12/ross11a/ross11a.pdf

cs.cmu.edu/~sross1/publications/Ross-NIPS07-BAPOMDP.pdf


also to clarify:
the belief/value/policy estimators in the MDP/POMDP are typically NNs in practice (e.g. AlphaGo, AlphaZero, other DeepMind stuff)
Also note that normal MDPs hold a probability distribution too (the transition model), just not an explicit belief over states

Bayesian RL =/= Bayesian Deep Learning

There is nothing inherently bayesian about POMDPs

You are just confused, friendo

In practice it does, "Deep Learning" being a buzzword for using multi-layered NNs.

>There is nothing inherently bayesian about POMDPs
the 'M' in MDP, the Markov property, is what makes it inherently Bayesian.
en.wikipedia.org/wiki/Markov_property

The state transition probability term is what gives MDP formulations the Markov property.
This term is a representation of the *prior* over successor states.
Working with an explicit prior is the defining feature of Bayesian statistical inference.
There is a reason the POMDP literature refers to POMDP states as prior belief states (because the transition term is explicit) and updates them using Bayes' rule.
How is this wrong?
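spelled out in code, the full belief update is predict-then-correct (all numbers made up):

import numpy as np

T = np.array([[0.9, 0.1],      # T[s, s'] = P(s' | s): the transition term
              [0.2, 0.8]])
O = np.array([0.7, 0.1])       # P(obs | s) for the observation just received

belief = np.array([0.5, 0.5])  # prior belief over the 2 hidden states

belief = T.T @ belief          # predict: push belief through the transition model
belief = O * belief            # correct: weight by the observation likelihood
belief /= belief.sum()         # renormalize -> posterior belief (Bayes' rule)
print(belief)                  # -> [0.895, 0.105]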


Inverse problems boiiiiii

MDPs are essentially a "data structure" used for modeling; they can be used with non-Bayesian methods just as well.

You seem to have general knowledge of the field, but you're making statements that just don't make sense.

Thanks!

What about free energy minimisation?