Who machine learning here?

who machine learning here?

I've been doing research in this. I can confirm your pic is pretty spot on.

>who data scientist

What in the world does it mean for a kernel to be unbiased?

what are some good resources to get started with statistical learning/machine learning?
i have a math background but my stats knowledge is pretty basic

wikipedia is good. I would say start by reading up on Linear Regression models.

A good exercise is to attempt to derive the Fourier Transform via least squares regression over a basis of orthogonal functions. Knowing how linear algebra can be used to form predictive data analytic models is important.
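
To make this concrete, here is a minimal numerical sketch (my own illustration, not the full derivation): fit a function by least squares over an orthogonal sine/cosine basis and check that the fitted coefficients roughly match the textbook Fourier coefficients.

[code]
# Least squares fit over an orthogonal trigonometric basis (illustration only).
import numpy as np

N = 256                                   # number of sample points
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
f = np.sign(np.sin(x))                    # square wave as the target function
K = 10                                    # number of harmonics to keep

# Design matrix whose columns are the basis functions 1, cos(kx), sin(kx).
cols = [np.ones_like(x)]
for k in range(1, K + 1):
    cols.append(np.cos(k * x))
    cols.append(np.sin(k * x))
A = np.column_stack(cols)

# Ordinary least squares. Because the columns are (discretely) orthogonal,
# each coefficient is just a projection <f, basis> / <basis, basis>.
coeffs, *_ = np.linalg.lstsq(A, f, rcond=None)

# Fourier series of a square wave: b_k = 4/(pi*k) for odd k, 0 for even k.
for k in range(1, K + 1):
    print(k, coeffs[2 * k], 4 / (np.pi * k) if k % 2 else 0.0)
[/code]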

ayy
who here /pytorch/ ?

what do you want to do?

coursera stanford machine learning

thanks guys
i don't really have any goal in mind it just seems interesting

well what's your current area of expertise?

just finished undergrad degrees in math and comp sci

are you better at applied or pure math?
what did you concentrate on in CS?

there are quite a few subfields of machine learning, and it's easiest to start with something you're familiar with and then branch out

Curious, what is the setup for this derivation? I understand that least squares regression in linear algebra is performed by projecting the vector being fitted onto the desired basis by multiplying by the transpose, but I can't figure out what a basis of orthogonal functions is.

>trying to "build" a brain piece by piece when neurologists don't even understand 5% of ours

>be young kid
>love machine learning and programming
>2017
>every faggot wants to do it now as a money grab
>now everyone probably assumes I'm that faggot too

my focus was in set theory and logic so heavily theory-focused
my CS education was very practical, but i did a semester of grad school in theoretical CS at a really good institution where i learned a lot

cool, then try reading this and Google things you don't understand
metacademy.org/roadmaps/rgrosse/dgml

thanks, user

fixed your pic

That's not what neurologists do, and we understand most of the brain; the issue is tying everything together in a coherent fashion, as it's very complex. Statistical learning and machine learning have next to nothing to do with biology, though, so your premise is fictional.

Absolutely nothing. Being in statistical learning means being able to bullshit plausibly with an earnest frown on your face.

What is the goal of ML if not intelligence

More efficiently completing tasks for one.

Just jumped on the bandwagon when I saw a cheap machine learning course to spend my free time on.

Exercising my old and admittedly poor calculus knowledge with simple linear regression exercises for now. Collecting a Veeky Forums-related dataset now so I can try to model it: the number of new posts per minute on Veeky Forums given the time of day, day of week, etc. Pic related running for /a/ as a test run.
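
If anyone wants to follow along, this is roughly the kind of fit I mean; the column names here are placeholders, not my actual dataset.

[code]
# Hypothetical sketch: ordinary least squares on posts-per-minute vs. time features.
# The CSV layout (hour, weekday, posts_per_min) is assumed for illustration.
import numpy as np

data = np.genfromtxt("posts.csv", delimiter=",", skip_header=1)
hour, weekday, y = data[:, 0], data[:, 1], data[:, 2]

# Encode time of day cyclically so 23:00 and 00:00 end up close together.
X = np.column_stack([
    np.ones_like(hour),              # intercept
    np.sin(2 * np.pi * hour / 24),
    np.cos(2 * np.pi * hour / 24),
    weekday,
])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least squares fit
pred = X @ beta
print("RMSE:", np.sqrt(np.mean((y - pred) ** 2)))
[/code]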

Seems a little disingenuous to compare it to neural networks then

What's the course? I'm interested

machines that learn and predict
it's in the fucking name

Well, I could give you the answer, but that's no fun. I'll send you in the right direction, like you asked.

Prove the following (not too difficult in principle):
Show that the set of integrable real functions is an inner product space closed under 2 different types of product:
Product of Functions [math][f\cdot g](x) = f(x)\cdot g(x)[/math] (the standard one you learn in high school)
An Inner Product on functions [math]\langle f,g \rangle[/math] (Try to figure out what it is on your own. If you can't then just look up "standard inner product on functions.")

Orthogonality can now be defined on this vector space of functions.
Doing this proof made me very interested in Functional Analysis, leading me down a rabbit hole of 3 textbooks. If no one gets it by tomorrow, I will post the proof.
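
In the meantime, once you've worked out (or looked up) the inner product, here's a quick numerical sanity check, not the proof, that distinct harmonics really do come out orthogonal under it.

[code]
# Numerical check of orthogonality under <f, g> = integral of f(x)*g(x) dx on [0, 2*pi].
import numpy as np
from scipy.integrate import quad

def inner(f, g, a=0.0, b=2 * np.pi):
    val, _ = quad(lambda t: f(t) * g(t), a, b)
    return val

f = lambda t: np.sin(2 * t)
g = lambda t: np.sin(3 * t)

print(inner(f, g))   # ~0: distinct harmonics are orthogonal
print(inner(f, f))   # ~pi: squared "length" of sin(2t) on [0, 2*pi]
[/code]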

>Have data
>Pump it into weka
>Run every algorithm on the data until I determine which has the greatest accuracy
A chimp could do this, why is there so much demand?
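
To be fair, that whole workflow really is a few lines these days; here's a rough scikit-learn equivalent (iris is just a stand-in dataset).

[code]
# "Throw every algorithm at it and keep the most accurate" -- sketch in scikit-learn.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
models = {
    "logreg": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(),
    "svm": SVC(),
    "gaussian nb": GaussianNB(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()   # 5-fold cross-validated accuracy
    print(f"{name}: {acc:.3f}")
[/code]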

> be me
> hs student
> find out about machine learning
> start learning statistics in my spare time
> go to uni to maths+stats
> realise that the field is now overhyped and chances to get a job tend to 0
REEEEEEEEEEEEEEEE I fucking want to die why did you ruin the only thing I really liked.

...

>he fell for the stats meme

At least not cs.

I work with pytorch. Been working on an analog of tensorboard for this package for a while. I can plot computational graphs and have expandable/collapsible nodes. Planning to make my repo public soon.

>A good exercise is to attempt to derive the Fourier Transform via least squares regression over a basis of orthogonal functions. Knowing how linear algebra can be used to form predictive data analytic models is important

mate, do you realize that you just reworded the decomposition in the Fourier basis with this sentence? There's no problem to solve here.

I do. That's the point of the question. The problem is to derive the formula for Fourier Decomposition, not just write it.

haha lol nice those machines sure are "learning" stuff lol they are "learning" because they are artificial "intelligence" right lol those machines are intelligent ahahahahahaha

I just spent 50 hrs learning tensorflow. Am I a cuck?

Chimps are endangered

Since we are talking here about ML. Does anybody have a link to a brainlet tier explanation of how Gaussian Naive Bayes works? Just got into ML and know how random forests and support vector machines work, but can't grasp naive Bayes. I'm not good at statistics, yeah.

t. freshman

that's great! that's the one thing we don't have between visdom and tensorboard for pytorch. How are you dealing with dynamics? just create it based on a single autograd pass?

You guys know this field is a meme right

How much math do I need for machine learning?

Is multivariable calc and linear algebra enough?

It's mostly statistics and probability from what I understand.

how much linear algebra do you need for ML anyways?
i took the required one for cs/engineering but it wasn't very rigorous
i also heard the math for my uni's ML class is fairly assrape

should i just pick up a textbook and start trying to crap out proofs or is that just pure overkill?

Support Vector Machines is one of the algorithms based only on linear algebra. Have a look at it.

Only if you know nothing about either field.

holy shit op i actually lost it at this pic

>tfw the field is being overrun by asians doing incremental work on conv nets and GANs
>tfw top tier conferences are filled with trash papers

>machine learning has near nothing to do with biology so your premise is fictional
Entirely false. Watch how layer V cortical neurons form their dendritic spines, and encode their outputs. Tell me forward propagation is not already a feature of nature.

a strong grounding from an advanced course in linear algebra should be enough. Reading all of "Algebra" by Michael Artin would give you an A+ understanding. Reading all the chapters about Vector Spaces, Linear Operators and Linear Groups would be good enough.

I hope you're aware that you're ruining dating for men.

Why are people content with using LSTMs for RNNs when they're such a shitty hack?

Why do RNNs and reinforcement learning feel like they're making so little progress?

It's not the kernel of a computer, but the kernel of a vector space

Yes.

It's getting quite complex now. Used to be easy.

Deep Learning...

Because they are actually hard problems.

We got very lucky that backprop "Just Works" for 99% of problems you throw at it. Unfortunately reinforcement learning doesn't seem to land in this domain.

Linear algebra is essentially the language of machine learning.

Without it you will get lost very fast

would shilov's "linear algebra" or similar be a suitable alternative/preread?
apparently artin is a rather ambitious first book

researcher here. this is true. 90% of papers have absolute trash premises and often can't be reproduced. every once in a while there are a few gems though, e.g. the recent selu paper

kek

serious question. are you jewish? and does the tindr algorithm promote racemixing or not?
it definitely checks if you have jungle fever.

not him, but it would make sense if it tries to give you similar shit (including race) in your next matches

>not getting the tinder pytorch joke

considering going for an ms or phd in machine learning
constantly wondering if the barrier of entry will be dramatically lowered in the next 4 years

Serious question: How bad of an idea would publishing a paper on arxiv about decensoring hentai with CNNs be? I'm finishing my MSCS and am worried it will hurt my PhD aspirations, but I have little else to show for research.

Can you find a less nsfw application for it?

This is essentially just super resolution right? This is an active topic of research.

I was a reviewer for ICCV and reviewed a number of interesting papers on this topic. If you have something interesting just use one of the standard datasets they use

It's more complicated than super resolution. With mosaic censorship, it's super resolution + image inpainting. In other cases, it's image inpainting.
I don't know of any novel less nsfw applications of super resolution + image inpainting. Super resolution of mosaic censored faces has been done before.

Another concern is about the data set. I want to open source it, but there's definitely copyrighted material in there ranging from images ripped from paywalled sites to fanart of copyrighted characters.

Just upload the tool and meme it on /a/ and /jp/ with some examples, not every application must have its own paper.

Linear programming was the prerequisite course to machine learning at my university. AI is hard for most people to grasp, and it often gets confused with functions that simply solve a fixed problem, as opposed to functions that can react to new information.

Anyone here also do genetic algorithms? I've done an introductory course on ML but I feel it's very incomplete for what I want (drone flight automation and unsupervised gaming). What is the method that gives the best results in those areas?

You might wanna try asking at the singularity of intelligence over at /r/MachineLearning

Why would you send me to a place where there is only porn?

i want to see how much accidental futa or de-trapping comes out of that

sorry if this gets personal, where are you doing your MS and how do you like it?

Think about what a random forest classifier is doing: it looks at the collection of decisions from the trees in the forest and says "i'm going to classify this object according to the majority".

A naive bayes classifier does something similar: it applies bayes theorem to a problem to figure out the probability that it belongs to one class or another, then assigns the object to the most probable class.

In order to apply bayes rule at all, you need some kind of assumption about how the features are distributed within each class. a gaussian nb classifier assumes each feature follows a gaussian distribution within each class
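
If it helps, the whole thing fits in a short from-scratch sketch (toy data made up for illustration): per class, fit a gaussian to each feature independently, then pick the class with the highest posterior.

[code]
# Minimal from-scratch Gaussian naive Bayes on made-up 2-feature data.
import numpy as np

X = np.array([[1.0, 2.1], [1.2, 1.9], [0.8, 2.0],    # class 0
              [3.0, 0.9], [3.2, 1.1], [2.9, 1.0]])   # class 1
y = np.array([0, 0, 0, 1, 1, 1])

classes = np.unique(y)
priors = {c: np.mean(y == c) for c in classes}
means  = {c: X[y == c].mean(axis=0) for c in classes}
vars_  = {c: X[y == c].var(axis=0) + 1e-9 for c in classes}   # jitter avoids division by zero

def log_gaussian(x, mu, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def predict(x):
    # log prior + sum of per-feature log-likelihoods; argmax = most probable class
    scores = {c: np.log(priors[c]) + log_gaussian(x, means[c], vars_[c]).sum()
              for c in classes}
    return max(scores, key=scores.get)

print(predict(np.array([1.1, 2.0])))   # expected: 0
print(predict(np.array([3.1, 1.0])))   # expected: 1
[/code]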

Dataset excludes futa and intersex. Nothing against them, but I want to make the problem easier.

All I will say is I'm at a top MSCS program in the US. There are plenty of great avenues for getting GTA/GRA work here, but I lack the social skills to fully utilize them.

>Make a formula with thousands of free variables and change them until you get what you want

REVOLUTIONARY

what was your undergrad record like to get in?
i've heard machine learning is basically meme-tier below top20

Did my CS undergrad at the same school so that helped.
3.8+ GPA.
Near perfect GRE.
No research, but 3 semesters as a TA.
Decent recs.

Can't you just go through old CSI episodes and look for applications within the "ENHANCE" meme parts? Would certainly give you a fun icebreaker in the form of "This is now actually possible, forensics is saved". Stuff like numberplates for example, if face is already off the table. Although I don't know what would be novel about that.

The great thing about it is that you can put monkeys on using pre-made tools for it. A great excuse to just stack away the vast majority of engineers produced. They don't have to understand shit, just set up the layers, stuff the data in and go.

control and prediction of engineering workforce

probability, multivariable calc, linear algebra. those three are the basics

Probably pretty bad. Just make it about decensoring anything block-censored instead. It will be a combination of super resolution and image inpainting and not only super resolution.

Just call it that and use other data to publish and keep all business ideas for the method to yourself.

I'm getting into machine learning. How do I into linear alg so that I don't break into a cold sweat every time I read the word "vector" in my textbooks?

If you are doing it to get into Machine Learning, then maybe just try a learning-by-doing approach first and see where that gets you. After trying this for a bit, you might better appreciate building solid foundations first. Then just go rigorously through the material from the ground up until you are at an appropriate level to conduct proper research and make your field less of a meme than it currently is.

Take the first year or two of uni, some engineering program or so, and then start some machine learning book. You will need a first course in probability, calculus and linear algebra to get anywhere.

...