/scg/ - Scientific Computing General

Discuss scientific computation and programming ITT. Keep insubstantial language wars to a minimum.

Previous thread:
What are you programming scientifically, Veeky Forums?


It's 2018 Veeky Forums, why haven't you tried Julia yet?


How do I get a comfy scientific computing job, Veeky Forums?

You get involved in a scientific project that involves computation.

Why didn't you post it?

no shit dum dum

No fuck off we don't need another general

It's to contain all of the CS shit.

Just hide it.

>Scientific document processor, Emacs-like math symbols completion, LaTeX integration, fast, ditch MS Word
TeXmacs
>CAS and numeric calculations, fast, easy even for brainlets
Maxima
>Easy prototyping and scientific programming, faster than python
Perl Data Language
>Statistical package, convenient
PSPP

Also TeXmacs has Maxima integration.

that's nice and all but what have you done with that setup recently?

Notes, lots.

Tell me more about Julia, user. What's so good about it and why should I learn it instead of other languages? Genuinely interested.

poopy libs

No, these threads attract /g/

They wouldn't if retards like you would stay on the subject of scientific computing instead of turning it into a thread to debate semantics.

Actually, you should probably go hang out in /g/ since you seem to be a brainlet

Syntax of python, performance of C. It's specifically designed for scientific computing so it has all sorts of matrix math and plotting built in like matlab.

Does it have a shit ton of packages and toolboxes, though (for control systems design, for example)?

I'm currently writing some shitty finite element code in matlab.
When I'm assembling the stiffness matrix, it tells me that my sparse index expression is likely to be slow.
Let's say I want to access and change the following elements in my matrix A:
(i,i),(i,j),(i,k)
(j,i),(j,j),(j,k)
(k,i),(k,j),(k,k)
Is there a better solution than A([i,j,k],[i,j,k])?

nevermind. it was all in the documentation of sparse
changed the runtime from 200 seconds to less than 2 lel
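For anyone else hitting that warning: the fix in the sparse docs is to collect (row, col, value) triplets during assembly and build the matrix once at the end (sparse(I,J,V) in MATLAB), instead of indexing into an already-sparse matrix element by element. A rough sketch of the idea in Python, using a plain dict as the triplet accumulator (the element connectivity and toy 1D stiffness below are made up for illustration):

```python
from collections import defaultdict

def assemble(elements, local_stiffness):
    """Accumulate (row, col) -> value triplets instead of writing into
    a sparse matrix one entry at a time. Contributions to duplicate
    entries from shared nodes are summed, as sparse(I,J,V) does."""
    triplets = defaultdict(float)
    for nodes in elements:                 # e.g. (i, j, k) per element
        ke = local_stiffness(nodes)        # small dense element matrix
        for a, i in enumerate(nodes):
            for b, j in enumerate(nodes):
                triplets[(i, j)] += ke[a][b]
    return dict(triplets)                  # build the sparse matrix once from this

# Toy example: two 1D linear elements sharing node 1, ke = [[1,-1],[-1,1]]
ke = lambda nodes: [[1.0, -1.0], [-1.0, 1.0]]
K = assemble([(0, 1), (1, 2)], ke)
print(K[(1, 1)])   # contributions from both elements are summed: 2.0
```

The point is that every write is an O(1) dict/vector append, whereas writing into an existing sparse matrix can force a re-sort of its internal storage on every insertion.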

it has some stuff
juliacomputing.com/products/juliafin.html
juliacomputing.com/products/juliadb.html

Never written a finite element solver before. What's it like?

are you familiar with the mathematical formulation of finite elements?

Just that it's a minimizer of some functional applied to a discretized version of the system at hand.
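That picture can be made concrete in a few lines: a toy 1D sketch in Python (no FEM library assumed, uniform mesh, everything hand-rolled), solving -u'' = 1 on (0,1) with u(0)=u(1)=0 using linear elements. The linear system K u = f below is exactly the stationarity condition of the energy functional J(u) = 1/2 ∫(u')² dx - ∫u dx restricted to piecewise-linear functions:

```python
def solve_poisson_1d(n):
    """n linear elements on a uniform mesh of (0, 1)."""
    h = 1.0 / n
    m = n - 1                      # interior (unknown) nodes
    diag = [2.0 / h] * m           # assembled stiffness: (1/h)*tridiag(-1, 2, -1)
    off = -1.0 / h
    rhs = [h] * m                  # load vector for the unit source term
    # Thomas algorithm: O(n) tridiagonal solve, forward elimination...
    for i in range(1, m):
        w = off / diag[i - 1]
        diag[i] -= w * off
        rhs[i] -= w * rhs[i - 1]
    # ...then back substitution
    u = [0.0] * m
    u[-1] = rhs[-1] / diag[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (rhs[i] - off * u[i + 1]) / diag[i]
    return u

print(solve_poisson_1d(4))  # nodal values of u(x) = x(1-x)/2 at x = 0.25, 0.5, 0.75
```

For this particular problem the nodal values happen to match the exact solution u(x) = x(1-x)/2; the real work in 2D/3D is the assembly loop over elements, which is where the sparse-matrix business comes in.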

>performance of C
k

meh I was just about to write a wall of text and now the window collapsed and my text is gone.
Too tired to continue now, sorry m8

So what's the advantage when I can just import numpy?

not him, but I have some evidence
julialang.org/benchmarks/
github.com/johnfgibson/julia-pde-benchmark/blob/master/1-Kuramoto-Sivashinksy-benchmark.ipynb

>MIT license

>python

Don't bother learning Julia, the best parts are going to be closed source and you'll have to pay for them: venturebeat.com/2015/05/18/why-the-creators-of-the-julia-programming-language-just-launched-a-startup/

MIT/BSD licenses strike again.

>keep using matlab and mathematica though, goys
>nothing to see here!

>let's use crippled software instead

Is this where TLA+ users talk?

I'm using it to define states and transitions. Underrated.

Apart from /dpt/, /g is cancer, therefore /dpt/ and serious programming general(s) should move to /sci

>MIT/BSD licenses strike again.

Fuck off, you dirty commie. Everything that was released for free still is and will forever be free. Anybody's contributions still are and will forever be available to all, like they intended.

If the creators want to be paid for THEIR additional future work, it is 100% THEIR right. Nobody is entitled to the creators' OWN labor.

>Don't bother learning Julia, the best parts are going to be closed source

And that's not even what the article said:
>We feel really fortunate that Julia has become such a healthy open-source project – at this point, it is clearly here for the long haul. Some people have expressed concern that we might be tempted to undermine that by handicapping the open version and selling a closed version with better functionality. This would not only be bad for the project, but also terrible for our business. No one has made a good business off this kind of move: it ends up sabotaging the project, which in turn ultimately kills the business. We’re in this for the duration — our goal is to create a vibrant and fruitful collaborative ecosystem, that includes academic researchers, developers who contribute for personal enjoyment, and companies using Julia for business.

They're being paid to fix show-stopping bugs quickly and to help when devs using it get into trouble, like many other similar companies.

Code monkeying is not science. Fuck off.

I think it's pretty obvious we are here to have an invigorating discussion about the tools we use and what we do with them. I stop by /g/ too (though I avoid the bad parts) for the same, but here we can talk about the finer points.

Please don't bring this here.

All science done in the last 20 years incorporates programming in some form, you filthy undergrad.

>All science done in the last 20 years incorporates writing in some form, you filthy undergrad.
Doesn't mean we should have a Veeky Forums general thread.

Stop.

Abandon MATLAB web.archive.org/web/20150411031144/https://abandonmatlab.wordpress.com/

>/g
>/sci
>/dpt/
What's up with this weird forward-slashing? Some Reddit shit?

>All science done in the last 20 years incorporates programming in some form
[citation needed]

>"leftist"
>"communist"
>muh entitlement to private property

python vs matlab. go

Can we collect some resources to put in the OP?

Looking for project ideas.

Currently writing a ray tracer in Haskell.

Errrm.

Maybe technically a bunch of libraries, but I think it qualifies.

Where are some good resources for learning about neural networks and back/forward propagation?

Alright Veeky Forums, I'm taking a class that teaches both Matlab and Maple. Any things I should know going in?

t. Programming noob

Try SML, because MLton exists, which is a whole-program optimizing compiler and what they currently use to write proof assistants in higher/cubical type theory because of the insane levels of optimization. Perfect for something like a ray tracer, which wants high performance.

neuralnetworksanddeeplearning.com/
The Art of Computer Programming vol 4b Backtracking

No easy way to get that on a kindle?

Oops, I linked the wrong book.
deeplearningbook.org/ (it's also free)

I'll check out all three, no problem, thanks. Would the best language be Python for ease of use, or C# for generics?

Just use whatever they are using in the books/courses. Btw, if you search around Carnegie Mellon or Stanford grad class sites, they have a lot of these courses open, where you get a grad-level introduction to the field to prepare you for reading recent papers. That's what you want: deeplearning.cs.cmu.edu/

Also, if you look around YouTube you can often find the recorded lectures too: youtu.be/Qbztx1CfSfo

Is that available to european students as well? Or just US students?

You are not doing it for credit; you are watching lectures, doing the assigned reading, and doing the assignments and tests yourself so you can build a base in the field and then do independent research. Don't contact anybody at the university, but yes, if you really wanted to, you could go to CMU and pay the $40k+ per year as a European student, or do Johns Hopkins University's online Masters in deep learning/ML/stats.

Can't get better than that.

Thank you.

Why is it that newfags think everything is from reddit?

>Currently writing a ray tracer in Haskell.
That is objectively not scientific computing.

Best not to narrow your scope so much. Neural networks are just another kind of function, back-propagation is just another application of gradients to optimize a target. Learn about these things in a more general setting and you will not only understand the latest buzztech, but also be more versatile in the long run.
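To make that "just gradients" point concrete: here is gradient descent on a one-neuron toy "network" in pure Python, with every chain-rule factor written out by hand. Backprop is exactly this, repeated layer by layer; the numbers and setup are made up for the example.

```python
import math

# Toy "network": one neuron, one training example (x=1, target=1).
# Backprop is the chain rule applied to the loss; with one layer we
# can write every factor out explicitly.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, x, t, lr = 0.0, 0.0, 1.0, 1.0, 1.0
for _ in range(1000):
    z = w * x + b              # forward pass
    y = sigmoid(z)
    dL_dy = 2.0 * (y - t)      # L = (y - t)^2
    dy_dz = y * (1.0 - y)      # sigmoid'(z) in terms of y
    dL_dz = dL_dy * dy_dz      # chain rule
    w -= lr * dL_dz * x        # dL/dw = dL/dz * dz/dw
    b -= lr * dL_dz            # dL/db = dL/dz * 1
pred = sigmoid(w * x + b)
print(pred)                    # close to the target 1.0 after training
```

Everything a framework like TensorFlow or PyTorch adds on top is bookkeeping: composing these local derivatives automatically through a graph of many such functions.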

Also, in case brainlets can't understand why there's only 1 lecture so far: it's because the course just started today. Download the YouTube vids as they come out, as they often get deleted at course end.

do any actual physicists or engineers use matlab? or is it just a crutch for students?

Yes

Hey guys, is a computational physics graduate program a meme? Is it for failures? Or is it legit? The fact that U of Alaska offers it makes me wary.

Julia's getting much better package support as it acquires users. You can also use PyCall to import any Python packages you want.

Most of Julia's packages are just Python calls, meaning the whole language is just a skin for Python, and you've just made your two-language problem a three-language problem.

Calling C and Fortran libraries is a plus; calling Python is just unnecessary (it's a bad language and its libraries are horrible).

Except that calling Python from Julia IS a necessity because half of its packages are just wrappers for Python.

Too bad, developers should just focus on what made it popular, and that is C performance. Instead it's just a Python wrapper, which is redundant, not to say a handicap.

>half of its packages are just wrappers for Python.
Source?

I genuinely hope you kill yourself.

Leave.

Reported.

kek

This thread is somehow worse than the first.

Oh sorry. I emailed my professor. Turns out I did get an A on it, I got an A- in the class bc I turned in the midterm 2 days late.

>Instead it's just a Python wrapper, which is redundant, not to say a handicap.

Problem is, Python has a head start. Who's going to come up with a Julia-native equivalent for Matplotlib or Pandas in a few months, which has similar functionality and scope?

Julia's interesting and I'll keep reading about it, but for now, I think Python's still the better choice for my stuff. I do have a scientific computing task coming up in a few months, but I can't see myself doing it in Julia. I've just managed to convince everyone in the group to ditch IDL and Matlab in favour of Python, if I now tell them that the new package will be written in another new language, they'll riot.

I started filming some basic finance math videos using Python

youtu.be/h-sZ4kgln40

>I've just managed to convince everyone in the group to ditch IDL and Matlab in favour of Python,
You done goofed, son

My thought exactly.

I was reading on expert systems and stumbled upon Prolog en.wikibooks.org/wiki/Expert_Systems/Prolog
Which took me to Horn clauses en.wikipedia.org/wiki/Horn_clause
If you love reading about logic in your spare time, do read these.
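To give a feel for why Horn clauses matter computationally: restricting every clause to at most one positive literal is what makes inference cheap. A tiny propositional forward-chainer in Python (this is the flavor of what Prolog does, minus unification and backtracking; the rules below are invented for the example):

```python
# A Horn clause is "head :- body1, ..., bodyn" (at most one positive
# literal). Propositional forward chaining just fires rules until
# nothing new can be derived.

def forward_chain(facts, rules):
    """rules: list of (body_set, head). Returns everything derivable."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in known and body <= known:
                known.add(head)
                changed = True
    return known

rules = [
    ({"rain"}, "wet_ground"),
    ({"wet_ground", "cold"}, "icy"),
]
print(forward_chain({"rain", "cold"}, rules))  # derives wet_ground, then icy
```

With a smarter bookkeeping scheme this runs in time linear in the total size of the clauses, which is exactly why Horn logic (unlike full propositional logic) is tractable.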

Why would I read about dead technologies?

You are not the brightest man on earth, are you?

The theory behind it is entertaining.

Cause it's shit

Kek, really? How far from reality does one need to be to prefer IDL/GDL and Matlab to an equivalent Python/Numpy solution?

Have you guys been to Project Euler projecteuler.net/ ? It's a nice place for practice. I see there the difference between using low-level programming languages versus high-level ones; damn, it's hard to make things at the low level.
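For instance, the very first problem (sum of the multiples of 3 or 5 below 1000) is essentially one line in a high-level language, while in C or assembly you'd be writing the loop and accumulator yourself:

```python
# Project Euler, problem 1: sum the natural numbers below 1000
# that are multiples of 3 or 5.
answer = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)
print(answer)  # 233168
```

Later problems flip this around: the naive high-level one-liner becomes too slow, and you start caring about the algorithm regardless of language.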

True?

>implying Big O is impressive
>implying Taylor series are impressive
>implying assembly is impressive
>implying gate diagram for a fucking ripple carry adder is impressive
>implying picture of a FET is impressive
>implying multitape and single tape Turing machines are impressive
>implying the pumping lemma is impressive

Mathematica

Why are you taking it instead of the class that teaches C++?

New stats major here. I want to get good at R programming before I even take a class in it. Any recommendations?

Logic gates, Big O, and induction are all things that I witnessed in an introductory programming class

r for pirates got me started even though it's written by a brony retard

I do machine learning research (or rather, did, then my employer violated my work contract so I quit). I'm looking to either find someone willing to let me do research in industry (as was the case with my previous employer before going full retard), creating my own company, or pursuing a PhD (I only have a master's, but I don't want to do ML (((engineering)))).
I am currently elaborating the research topic I'd like to work on were I to go for a PhD. Most modern ML comes in two flavors: incremental, engineering-like improvements over previous models; or breakthroughs from mathematics, usually statistics, which at first present a huge class of practical issues (poor data efficiency, bad stability when the error surface is not smooth enough, which it never is on non-toy tasks, etc.) but eventually become quite good (Wasserstein GANs vs vanilla GANs are a good, modern example of that). Because of this, I am thinking that the fundamental approach to ML currently employed is wrong (we notice several problems with backprop and its implications in the first place). Therefore, I think we should move more toward ideas similar to computational (behavioral) neuroscience, with the goal of locally-updating, distributed, specialized learning mechanisms (still data-oriented).
What material would you suggest in comp neurosci? Everything from single- to multi-neuron models is relevant (there was an interesting recent paper suggesting neural response is a function of signal locality; playing with that model could be useful), but multi-neuron inter-communication and coordination is of course probably most relevant.

Yoshua's Deep Learning or PRML (PRML is more like a reference book everyone keeps by their bedside but it's decent to learn from too; Deep Learning is a great modern book on neural networks though).
The best source is always various collections of papers though.