What are you computing, Veeky Forums?
Scientific Programming Thread
Not science or programming.
I'm learning C++ at uni. Shit's cool, working on programming a Pac-Man clone for a project.
matlab gets so ass once you stop doing strictly linear algebra stuff
python is leagues better
Using malloc at all?
I remember a C# class (we programmed in C++ but exclusively had to use C# libraries) where they got pissy at me for using vectors instead of malloc (even though dynamic allocation wasn't in the curriculum)
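For what it's worth, std::vector is itself dynamic: it manages heap memory for you, which is exactly why it's usually preferred over raw malloc in C++. A minimal sketch of the contrast (function names are made up for illustration):

```cpp
#include <cstdlib>
#include <vector>

// std::vector IS dynamic: it allocates on the heap and grows as needed.
std::vector<int> squares_vec(int n) {
    std::vector<int> v;
    for (int i = 0; i < n; ++i)
        v.push_back(i * i);   // reallocates automatically when capacity runs out
    return v;
}

// The malloc equivalent: manual sizing, manual freeing, no size bookkeeping.
int* squares_malloc(int n) {
    int* p = static_cast<int*>(std::malloc(n * sizeof(int)));
    for (int i = 0; i < n; ++i)
        p[i] = i * i;
    return p;                 // caller must remember to std::free(p)
}
```

The vector releases its storage automatically when it goes out of scope (RAII); the malloc version leaks unless the caller remembers to free it.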
proprietary software has no place in science
Will start my PhD this fall on direct numerical simulations of textured surfaces. Quite hyped for it, the lab I’ll be working at seems to use a lot of C++ and Fortran with MPI, though I’d also like GPU parallelism to be part of my research.
Might as well just kill yourself now
AMPL dawg, what else would you ever use?
Python master-race reporting in.
Been trying to simulate diffusion across a cell membrane with pumps and channels. I think the 1D case is working. Eventually I want to try doing a couple of neurons in a "test tube". There are a lot of models out there with things like voltages as the main state variables, but I've been curious how things would work with ionic concentrations as the main focus. I think it's going to take a lot of work to get it to that level, though.
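For the 1D case, here is a minimal explicit finite-difference sketch (pure diffusion with no-flux walls; the grid, constants, and function name are made up for illustration, and pumps/channels would enter as extra source terms on top of this):

```python
import numpy as np

def diffuse_1d(c, D, dx, dt, steps):
    """March dc/dt = D * d2c/dx2 forward in time with reflecting (no-flux) ends."""
    c = np.asarray(c, dtype=float).copy()
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable: shrink dt or coarsen less"
    for _ in range(steps):
        lap = np.empty_like(c)
        lap[1:-1] = c[2:] - 2.0 * c[1:-1] + c[:-2]
        lap[0]  = c[1]  - c[0]    # mirror at the left wall
        lap[-1] = c[-2] - c[-1]   # mirror at the right wall
        c += r * lap
    return c

# A spike of ions in the middle of a "test tube" spreads out,
# while the no-flux walls conserve the total amount.
c0 = np.zeros(101)
c0[50] = 1.0
c1 = diffuse_1d(c0, D=1.0, dx=1.0, dt=0.25, steps=200)
```

The reflecting boundaries make the discrete Laplacian telescope to zero, so total concentration is conserved exactly, a handy invariant to check before adding pump and channel terms.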
Mathematica 11.3 is really great by the way.
I'm currently doing membrane simulations as well! At the moment I'm mostly just getting up to speed with current simulation methods in my advisor's field, but also rewriting a bunch of his old Tcl routines in python and improving them. Looking at simulated buckled membranes to determine their bending rigidity.
Looking at simulated buckled membranes to determine their bending rigidity.
Does that mean you're trying to model the dynamics of the membrane geometry? I hadn't thought to do that, but I guess if you want to look at things like development and pathologies it'd be pretty essential, no? What's the field?
Have a few questions
1. How can I learn GitHub/Git? Are YouTube vids the best way?
2. Can C++ do everything Fortran can?
3. How do you do long simulations? I don't get it. Some code I see is like 5,000 lines. I wrote a very simple code for the interaction between plasma and the atmosphere using the finite element method. It was barely 500 lines. I did get an A on it, but it seemed so short.
Lines of code are a bad way to judge complexity. Git is fairly simple and you could probably learn it from the O'Reilly book which you can easily get for free. Have yet to mess with Fortran.
where they got pissy at me for using vectors instead of malloc (even though dynamic shit wasn't in the curriculum)
Are C++ vectors not dynamic?
Haven't had the need to learn/use malloc yet.
Keep in mind this is literally my first programming course.
They literally teach us functions, how to pass by reference, arrays, matrices, and recursion, and now have us program a game. They didn't even tell us what libraries to use, so I'm learning everything as I go.
Please go back to /pol/.
Python, R, Mathematica and lastly C
Theoretical/computational biophysics is what you'd call the broad field I guess. We mostly do coarse-grained molecular dynamics simulations, but also a lot of analytic work in continuum-elastic theory, with the Helfrich Hamiltonian
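For anyone following along, the Helfrich Hamiltonian mentioned above is the standard continuum bending energy for a fluid membrane (standard form, symbols as usually defined):

```latex
E = \int \mathrm{d}A \left[ \frac{\kappa}{2}\,(2H - c_0)^2 + \bar{\kappa}\, K \right]
```

Here $\kappa$ is the bending rigidity (the quantity being extracted from the buckled-membrane simulations), $c_0$ the spontaneous curvature, $H$ the mean curvature, and $K$ the Gaussian curvature; by Gauss-Bonnet the $\bar{\kappa}$ term is constant for a closed membrane of fixed topology, so it usually drops out.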
These do belong here
discussion over the FSF and GNU philosophy
These don't belong here
Anyone here use Singular, MAGMA or Macaulay2?
Symbolic calculations over the rationals are an absolute joy.
I wrote a very simple code
Well there's your answer. The third-party radiative transfer code I'm using is something like 45,000 lines. And that's coupled to our simulation, which is another 60,000 or so. Things get complicated very quickly.
Only heard of MAGMA, what are the other two typically used for? I agree with your sentiment on infinite precision arithmetic; it feels like how computation was meant to be done.
python for actually doing things.
julia for fucking around
That's nice, user, but can you go into more detail on the "actually doing things" aspect? Are you scientifically computing something?
The other two are meant to help with algebraic geometry calculations. You can do all kinds of cool stuff with ideals and modules.
julia is leagues better than python. Literally no reason to use python.
I took courses in C and C++, and in MATLAB and Python. Why do they make us learn this piece of shit?
It's a very cool lang, but where in the world does anyone use it for science?
everyone majoring in STEM should get a minor in CS or at least take several programming courses. it's just too valuable of a tool to not be taken advantage of.
YouTube vids suuuuuuuuuck for learning code. Find a text based tutorial, there's plenty.
Large projects don't come out of thin air. They all start as small projects studying one use case that expand to cover more. Writing code that can be easily abstracted for bigger projects as you need it is a skill you'll pick up as you go.
In what way?
Who /NetLogo/ here?
Why is VHDL not on there you fucking plebs?
Fuck off commie.
Because it's not a programming language, it's a hardware description language.
Why is it so hard to program? I had to drop out of CS into math because I struggle so hard with programming. Even years into my math degree, and potentially a math PhD, I can't program.
os transformation, restoration and reboot
Is python taking over as the most popular programming language?
I've been pretty resistant when it comes to learning it. I prefer Java, but it seems all of the statistics/machine learning pros use either Python or R. No bullshit, is it worth learning?
I prefer Java
I prefer Java
What do you find hard about it?
I can't make anything
I understand boolean and conditionals and loops
But I just can't put it together to make something
I failed out of my colleges python class
I failed out of the Java course
But I did well in computer architecture and operating systems
I'm trying to teach myself common lisp now from the land of lisp book
It's just what I happened to pick up while working.
falling for the hate java meme
It's not a meme
For stats do R and for ML do Python.
Python is dominating every ML library and there isn't as much support for Java in that field at least. Again, this is only in terms of ML & Stats.
Java is shit. Use c# at least
Why is Java shit?
Thanks, have you tried any machine learning in R? I know there are some libraries but I haven't used them.
sad but true.
What's your recommendation for learning R? Book-wise, I mean. Something to preferentially contextualize it in social science stuff.
What did you try making?
Fortran is still great imho. I would recommend it to anyone.
Nothing will ever overtake C.
1. How can I learn GitHub/Git? Are YouTube vids the best way?
for a quick look
to better understand
when you are still trying to remember the commands
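Whatever resource you pick, the day-to-day core is only a handful of commands. A runnable sketch (repo and file names are made up; the config lines are only needed if git hasn't been set up on the machine yet):

```shell
git init sim                                  # create a repo in ./sim
git -C sim config user.name  "anon"           # identity for commits (local to this repo)
git -C sim config user.email "anon@example.com"
echo 'print("plasma")' > sim/sim.py
git -C sim add sim.py                         # stage the change
git -C sim commit -m "first pass at the FEM solver"
git -C sim log --oneline                      # one line per commit
```

Everything else (branches, remotes, GitHub itself) layers on top of this add/commit loop, so it's worth getting comfortable with these few commands first.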
GUI or CLI?
I don't know what those words mean
I'm using common lisp because everything else seems impossible
GUI = graphical user interface
CLI = command-line interface
At the beginner level the MOST important thing about a language is its debugging infrastructure. Choose something with an IDE that allows you to step through your programs line by line, monitoring variables as they change.
I started out with Java/Eclipse and went through the lectures from an intro CS course at Stanford,
but anything with the above-mentioned capabilities will work fine.
Starting programming is nightmarish, the only way to learn is to stick with it.
just some this and that
some physics stuff, coil inductance, Complete Elliptic Integrals
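That sort of calculation is compact in any language. As a sketch, here is the mutual inductance of two coaxial circular loops via Maxwell's formula, with K and E computed by the arithmetic-geometric mean instead of a library call (function names and the test geometry are made up for illustration):

```python
import math

def ellipK_E(k, tol=1e-15):
    """Complete elliptic integrals K(k), E(k) via the arithmetic-geometric mean."""
    a, b, c = 1.0, math.sqrt(1.0 - k * k), k
    s, p = 0.5 * c * c, 1.0          # accumulates sum of 2**(n-1) * c_n**2
    while abs(c) > tol:
        a, b, c = 0.5 * (a + b), math.sqrt(a * b), 0.5 * (a - b)
        p *= 2.0
        s += 0.5 * p * c * c
    K = math.pi / (2.0 * a)
    return K, K * (1.0 - s)

def mutual_inductance(ra, rb, d):
    """Maxwell's formula for two coaxial loops of radii ra, rb at separation d (SI, henries)."""
    mu0 = 4e-7 * math.pi
    k = math.sqrt(4.0 * ra * rb / ((ra + rb) ** 2 + d ** 2))
    K, E = ellipK_E(k)
    return mu0 * math.sqrt(ra * rb) * ((2.0 / k - k) * K - (2.0 / k) * E)
```

Sanity checks worth doing: K(0) = E(0) = π/2, and the inductance should fall off as the loops are pulled apart.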
Can't tell if baiting or incredibly underage.
Why? only underaged may use C to solve math problems? :P
Only an underage would think copying a simple equation into C constitutes "solving math problems".
anyway, you don't know what i did
neural networks for unsupervised text summarization
No one talks about that one time when all the top machine learning researchers signed a letter to leave the top journal because it was proprietary and paid access.
It was a momentous move that pushed machine learning, and therefore deep learning, research to open access, which caused the flood of open research and open-sourced libraries we see in deep learning, which in turn spurred the current wave of research.
And no one talks about this.
Follow this order:
1. Read an Intro book/tutorial
2. Read a cookbook
3. Read an application-specific book
ABM (agent-based modeling) gets no love on Veeky Forums
It's really a shame because most anons take interest in it whenever a rare thread about it pops up
strictly python because I am not a loser
computers are the best tool man has ever invented
they enable you to leverage your brainpower and memory
must know to do science
But you are a loser and choosing python is why.
Lmao at this board, completely comp sci illiterate.
There is no malloc in C#
You're obviously referring to C, which is completely different from C# (C sharp).
I don't know how you can possibly make this error having done a C course. You must be a special kind of retarded.
Bioinformatician here, I'm using mostly R, python and command line tools for everyday work. Occasionally MATLAB for modeling metabolic networks.
Scientific document processor, Emacs-like shortcuts for completion of math symbols and formulas, LaTeX integration, fast, ditch MS Word
CAS and numeric calculations, fast, easy even for brainlets, integrated in TeXmacs
Easy prototyping and scientific programming, faster than python
Perl Data Language
Statistical package, convenient
R for species distribution modeling and geospatial analyses