Machine learning... without python

>wants to do machine learning
>hates python
wat do

using python is like riding a self-driving car and saying you're the driver

Use R or any of the canned neural network programs that already exist.

>using python is like riding a self-driving car and saying you're the driver
that's why op is seeking an alternative, darling

Use whatever language you want and write every function yourself.

This desu, it's the only way to understand your program properly.
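
Something like this is all it takes to get started: a toy single-variable linear regression fit by gradient descent with zero libraries. (Sketch only, the data and learning rate here are made up for the example.)

```python
# Single-feature linear regression trained by gradient descent, no imports at all.
def fit_linear(xs, ys, lr=0.05, epochs=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        dw = db = 0.0
        for x, y in zip(xs, ys):
            err = (w * x + b) - y      # prediction error for this sample
            dw += 2 * err * x / n      # d(MSE)/dw
            db += 2 * err / n          # d(MSE)/db
        w -= lr * dw
        b -= lr * db
    return w, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]              # generated from y = 2x + 1
print(fit_linear(xs, ys))              # converges towards (2.0, 1.0)
```

Swap the loop body for whatever model you actually care about; the point is every line is yours.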

julialang.org

If the language matters to you, it's only because you're relying too much on pre-built solutions in the first place. Any general-purpose programming language will let you write these algorithms; it's math, it runs everywhere.

>julialang.org
is this really viable in academia or industry?

What the fuck is your problem with python? The python in TensorFlow is just glue. You could use Torch though, that uses Lua instead.

Several finance companies are using it, academics are starting to offer courses, the fundamentals are better than Python's, the 1.0 version will be out next year, and it's a good language to start doing ML in if you don't like Python.

Stop thinking you're so good and that Python is for brainlets. Python abstracts away lots of boring, mindless work in its standard library that C can't. If Python is preventing you from doing ML, you're an idiot. If you can't reproduce Python code in C, you're abstracting more than you should. This is coming from someone who has completed that course.

Python is almost the comfiest language ever created.

Speed is rarely an issue if you are doing machine learning, as you will be calling C/BLAS subroutines, or using tensorflow to do CUDA stuff.
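
Rough illustration of that point: the heavy lifting gets handed off to compiled BLAS the moment you call into numpy. (The triple-loop matmul_python below is just a made-up baseline for comparison; timings will vary by machine.)

```python
import time
import numpy as np

n = 256
A = np.random.rand(n, n)
B = np.random.rand(n, n)

def matmul_python(A, B):
    # Every multiply-add here goes through the Python interpreter.
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += A[i][k] * B[k][j]
            C[i][j] = s
    return C

t0 = time.time()
matmul_python(A.tolist(), B.tolist())
print("pure python:", time.time() - t0, "s")

t0 = time.time()
A @ B   # dispatched to the BLAS routine numpy was linked against
print("numpy/BLAS: ", time.time() - t0, "s")
```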

>Python is almost the comfiest language ever created.
but

R, though it's more on the statistics side

>Python is almost the comfiest language ever created.
Javascript

R, Python, at the end of the day it's all just a wrapper around C++.

There is literally one person who shills for julia

That's a good thing though. If the only existing languages were php and C++ I'd choose php

>using python to do machine learning
>not developing your own CPU architecture, writing your own instruction set, then writing your own programming language to do machine learning

comfy is just a code word for boring desu

>Not simulating a computer using ants which are directed to different paths by rewards

It's a pretty elaborate wrapper. I know C++ but would not be able to write Python scripts without checking every two seconds to remember what the Python version is for anything I'd be trying. It's about as distinct a language from C++ as any other compiled language is from C++ as far as I can tell.

>not directly programming the ants brains with a distributed swarm algorithm

Why would you hate Python? It's essentially pseudocode and takes the chore of writing super high performance code out of your way. It's actually really sexy when you get used to it.

The libraries for python like theano/tensorflow are wrappers which essentially interpret your code -> turn it into super efficient C++ -> compile it and run it

There is no way you're going to match the performance of C++ code written by google and accelerated by CUDA.
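
Roughly what that looks like in practice, assuming the TensorFlow 1.x graph API that was current at the time (a minimal sketch, not anyone's production code): the Python lines only describe the graph, the actual multiply runs in the compiled backend when the session executes it.

```python
import tensorflow as tf   # TF 1.x graph mode assumed

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[5.0, 6.0], [7.0, 8.0]])
c = tf.matmul(a, b)       # builds a graph node, nothing is computed yet

with tf.Session() as sess:
    print(sess.run(c))    # the C++ (and CUDA, if available) backend does the work
```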

Is this the course with Andrew Ng? I took it years ago. I thought he used Octave/Matlab?

>using python is like riding a self-driving car and saying you're the driver
Learn what programming is.

Use matlab you lazy fuck

>prove once, run anywhere!

jokes aside:
things like compiled vs interpreted languages actually matter for run time though

surely, you can't be serious user. C is probably the simplest language out there.

>calls others lazy for not writing in a language so lazy it is hardly even programming

MATLAB is a glorified calculator. I'm speaking from experience, I use MATLAB for work regularly. (And yes, I use functions, etc. The language is still braindead tier.)

Programming is glorified math...

Programming is the process of writing algorithmic instructions. Much of math avoids algorithms, such as measure theory. Specifically, I make this point to state that mathematics is not simple calculation.

You don't REALLY understand how the program works until you write it completely in Brainfuck

>things like compiled vs interpreted languages actually matter for run time though
OP said he *doesn't* want to use Python. Though I don't think it's a difference that matters for anything he's probably looking to do. Python isn't all that bad on efficiency; I remember a bunch of speed tests some of the Python developers published a while back that showed it was pretty much the same as if you had written what you were doing in C++.

Programming can be done in a very math-like way, but it doesn't have to be. If you use functional languages then yeah, you're basically phrasing everything as mathematical functions, but if you're using an object oriented language then that's much more its own thing.
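
The same computation phrased both ways, just as a toy illustration (the Accumulator class is made up for the example):

```python
from functools import reduce

data = [1, 2, 3, 4]

# Functional phrasing: compose pure functions, no mutable state.
squared_sum = reduce(lambda acc, x: acc + x * x, data, 0)

# Object-oriented phrasing: the running total lives inside an object.
class Accumulator:
    def __init__(self):
        self.total = 0

    def add(self, x):
        self.total += x * x

acc = Accumulator()
for x in data:
    acc.add(x)

assert squared_sum == acc.total == 30
```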

ocaml is the only truth

>not having your PI bankroll your personal fabricator

>wat do
Unironically kys

Every single programming language is heavily abstracted from the final machine code on most modern architectures. Even C. So go fuck yourself you elitist, gate-keeping fuck, who has probably never written anything significant in his life.

triggered script kiddie

>gate-keeping
How is anyone gate-keeping you from programming? There is literally nothing stopping anyone from programming except their own mental limitations, even subsaharan africans have computers nowadays and unless you fall for scams / proprietary bullshit you won't be spending a single cent on any of what you need to write programs.
If anything I wish there *were* gate-keeping for programming because it's obnoxious seeing a bunch of random casuals trivialize shit I care about.

If you work on anything hardware-oriented then there is some gatekeeping, because you need the hardware.

He thinks saying "Python is not real programming" is gatekeeping. Of course this does not limit your ability to program anything but it might limit your ability to be considered a "real programmer".

>this does not limit your ability to program anything
it does though

>Javascript
my sides

>php
>over C++
One is a general-purpose programming language running everything from high performance computers to spacecraft; the other is the worst garbage the world has ever had to endure, solely responsible for vast amounts of energy wasted running inefficient WordPress instances. It's a crime. But sure, choose the latter.

>hates python
Suck it up and learn Python already. There's no real reason to hate it other than GVR being an absolute tool, a few quirks in terms of member variable declaration, and the Python 2 vs Python 3 wars (GVR again)... As a language it's easy to learn and write, super versatile, and has all the guns you need to shoot at ML problems.

tl;dr
OP should suck it up, people who believe JS is a better language than Python are fucking morons, and people who consider PHP worth mentioning in any context should shoot themselves right now

i fully agree with all of this, all of it. so fucking much.

No. Python is Turing-complete. Technically anything that can be made in your precious C, C++ or Lisp can be made in Python.
and I say C and C++ because it's ALWAYS Cniles with the elitist attitude.

>Probably writes in C
>Most C elites can't even make a single relevant thing
I've learnt C and I'm now moving on to Python too for ML. (Only learned it for a better understanding of syntax in other languages, for reversing.) Quit being a toxic fag brah.

Protip: hardware exists, and it's what code runs on top of. Just cause something is Turing-complete, doesn't mean you can do anything with it. I guess if you don't give a shit about performance then you can just use Python, but if you actually want optimized code, particularly for HPC or embedded, Python ain't gonna cut it. inb4 "much BLAS function calls". I agree that Lisp is a jerkoff though

>Most C elites can't even make a single relevant thing
yeah cause nothing relevant runs on C amirite?
>Quit being a toxic fag brah.
this is embarrassing

>People take pride in the fact that they use low level languages
How to spot a brainlet, or an engineering student who messed around with a raspberry pi for a semester

>pure science fag takes pride in his ignorance about the machine he uses for his work

>doesnt understand irony
>cant read
>C elites = toxic faggots
>C programmers = making relevant useful shit
Pick one. Or link something that you've made that contributes to everyday people or yourself.
>inb4 links calculator

A significant portion of the entire field of CS has been devoted to trying to move away from low level programming, because it is inherently error prone.

high level =/= slow
low level =/= fast

In any situation where you actually need to write high performance low level code, you would hire a professional who does specifically that shit for a living.

I'm not going to reveal my power level on Veeky Forums, but I do work on "important" projects that contribute to society. And even if I didn't, that does not negate anything I said.

CSfags are moving away from low level cause they have no idea how hardware works. And anytime you add another layer of abstraction, yes it will slow things down. Sure, there is lots of research on how to reduce the overheads and some very promising tools, but this is sure as fuck not done in python.
>In any situation where you actually need to write high performance low level code, you would hire a professional who does specifically that shit for a living.
No, that's what you would do. I would do it myself, because I am actually competent in this area. You went from "python can do everything" to "python can do what i need and when it can't i'll just have someone else do it". lol

>I would do it myself, because I am actually competent in this area

Our definitions of high performance computing must be very different then. I'm talking clusters, embedded hardware, real time systems.... You are probably talking about writing code to process your experimental data 10% faster.

>I'm talking clusters, embedded hardware, real time systems
that's exactly what i'm talking about.
>to process your experimental data 10% faster
not really, but that is what clusters are often used for. i'm more of an embedded/realtime guy myself

>i'm more of an embedded/realtime guy myself

Exactly.... a situation where low level programming is entirely appropriate. Yet you have such a stick up your ass you try to convince people that doing machine learning in C is a good idea.

LOL! If they tried, they'd have a good understanding of how it works. Do you know why they back off from it? Because it's messy. It's pretty damn easy if you invest a few months in it (at most).

>Yet you have such a stick up your ass you try to convince people that doing machine learning in C is a good idea.
lol no, you are just getting overly defensive. There is nothing wrong with using Python for machine learning for many projects. I just said it can't do everything. This was my original comment. Keep telling yourself that, buddy.

Telling myself that?! I learnt within two months... Jeez, how long did it take you, if not a few months? Slow one aha.

>tfw functional programming master race

I get to call all of you inferior.

cool story bro. im sure you are super 1337

>cool story bro
Being this much of a mongoloid.

>being a triggered pleb

Congratulations on choosing "purity" over "being able to do anything worthwhile."

>t. side effect

> haha i triggered u get trolled haha !!!
Imagine calling someone triggered while starting an argument to begin with, I'm done with you fool.

k

>just cause something is Turing-complete, doesn't mean you can do anything with it.
You can though. That is literally what it means. It might be very shit performance-wise, but yeah.

Computationally, yes, meaning that you could theoretically compute anything with it given enough time. But saying you can "do anything" has real-world implications. There are many things you simply cannot do if you have shit performance (real-time systems etc).

That's not the joke. The joke is that you just import everything in python (see pic). It's a shit meme.

use R or MATLAB

kek'd

>not bypassing machine learning altogether and creating cyborg-ants that utilize the ants natural learning capabilities

Really should have been a coffee pot not a flower pot.

ietf.org/rfc/rfc2324.txt

The only good thing about Python is the libraries people have built. The language is ass.

Look into Erlang, it's a functional language and high-level enough (but not Lisp tier) to be useful for AI. Also several features like Erlang processes make it somewhat attractive.

>bypassing machine learning altogether
>cyborg-ants that utilize the ants natural learning
ok

Compsci here, worked with compilers before. You are absolutely wrong.

Forgot one thing:
>ITT: biology majors try to explain shit they don't understand, so end up saying CS fags say shit they actually don't.
Kys p/sci/chos

>Hate python
Come on kid, week 3 of intro to programming isn't that bad

Reinvent the wheel; from then on... it will be history.

sup, hijacking this thread a little bit.

On a scale of physics PhD (100) to Indian IT certificate (0), how hard would it be to graduate with a bachelor's and then a master's in bioinformatics?

we should make a thread with difficulty tiers based on field of study

My negro. F# works as well

High performance computing is usually done in C++.
Embedded in C.
Clusters/containers/cloud shit in Go.
Real time systems in C++.

Python is one of the slowest mainstream languages. There's a reason all modern languages avoid being interpreted and dynamically typed. By modern I mean Swift/Kotlin/Rust/Go/Julia.

For the next 5 years Python will be the king of data science, but I foresee Julia taking over after it gains more maturity. The moment you step outside of numpy in the Python world you're looking at ridiculous slowdowns, and no, PyPy isn't going to help there.
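
For a feel of the gap being talked about, here's a throwaway comparison of the same reduction done in a plain Python loop versus inside numpy (timings obviously machine-dependent, numbers made up until you run it yourself):

```python
import time
import numpy as np

x = np.random.rand(5_000_000)

# Element-wise loop in the interpreter: every iteration pays Python overhead.
t0 = time.time()
total = 0.0
for v in x:
    total += v * v
print("python loop:", time.time() - t0, "s")

# The same sum of squares pushed into numpy's compiled kernels.
t0 = time.time()
total = float(np.dot(x, x))
print("numpy:      ", time.time() - t0, "s")
```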

thanks for making my point I guess?

It's starting to become a thing in academia.
Also it just looks really promising, very suitable for scientific computing specifically.