Who needs mathematicians in the era of cheap, fast computers with numerical methods software?

Computers can't prove and invent shit. The only people that are unnecessary are labourers and computer scientists.
Kys

>Computers can't invent shit
I know we need engineers and inventors, my question was about mathematicians.

Who made the numerical methods and who will make the numerical methods in the future?

They're already developed; we can just replicate the code.

>Implying being an inventor is mutually exclusive with being a mathematician or engineer
Again, kys

Of course it's not.
If I asked
>do we really need experts on ladybug mating behavior?
would you answer
>yes, because some of them may be plumbers
? Sorry but you're the brainlet here

Good luck trying to understand the genetic code with those computers.

Explain how mathematicians are crucial to understanding the "genetic code".

You can't solve most differential equations numerically and can't even represent most numbers on a computer.

Mathematicians invent math shit, that can be applied to things like deeper physics, new optimizing strategies for whatever, etc.
Computers can't do these things (optimizing as implemented yes, creating new strategies no), so mathematicians cannot be replaced and certainly are not trivial.
Algorithms were written before the existence of computers; were algorithms useless? No, because we found applications for them.
The simple fact that you cannot see this for yourself makes you the brainlet here, my friend.
Third time now: kys

>Numerical methods are already developed
Yeah yeah, all space travel is already developed. We can just replicate the protocols now, because it's all the same.
~ Your logic

The topic is nonsense because you obviously don't understand what mathematicians 'do'.
It ISN'T arithmetic. If a mathematician needs two numbers multiplied, he uses a calculator like the rest of us.

Mathematics is about LOGIC, deducing things based on a minimum number of assumptions. The fact that mathematics is useful (physics, cryptography, genetics, economics) is incidental. Mathematicians would go right ahead if math had no "practical" applications -- except that there wouldn't be so many corporations willing to keep them on the payroll.

is an asshole. New algorithms are continually being written. Some problems can't be tackled just by making chips that run 10 (or 1000) times faster.

Mathematicians don't find applications for their algorithms. Engineers, chemists, physicists, economists, statisticians and others do. They make appropriate assumptions and apply known laws to create models, which are then solved using mathematical algorithms.
>differentiation, integration
>solving differential equations
>finite element analyses
>matrix/vector calculations
>iterative calculations
>cluster analysis
>curve fitting
And countless other tools have been already developed, and now we hardly need mathematicians anymore. We DO need the other people that I've mentioned though.

>Why would anyone need to understand what they're doing? When I bang these two rocks together it sometimes does what I want. Isn't that good enough?
The fuck is wrong with you?

Computers aren't fast enough. Simulating certain processes like chemical reactions will take thousands of years with our current computers. We need better numerical methods.
But computers can prove shit.

>Computers can't prove... shit.
Except they can.
>Computers can't... invent shit.
Except they can.
You probably mistakenly believe the inventing one isn't possible because a programmer has to write the program and that makes it "not count," but in reality programs aren't limited to needing *explicit* instructions. You can program the approach: let it train/learn by minimizing an error function over a known data set. When you do that, the program can end up solving problems in ways the programmer himself doesn't have direct insight into or knowledge of.
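A minimal sketch of that idea (the data, rule, and numbers here are all made up for illustration): the program below "learns" the rule y = 3x purely by minimizing a squared-error function over known data. The number 3 is never written into the learner itself.

```python
# Hidden rule the learner is supposed to discover: y = 3x.
data = [(x, 3.0 * x) for x in range(1, 6)]

w = 0.0    # initial guess for the slope
lr = 0.01  # learning rate

for _ in range(1000):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step downhill

print(round(w, 3))  # converges to 3.0
```

The programmer only specified "minimize the error"; the value 3 comes out of the loop, not out of the code.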

"Understanding" reduces to non-understanding sub-tasks of ordinary cause and effect relationships. There's less difference between what programs do and what people do than people today give programs credit for because we tend to want to put our own mental algorithms on a pedestal as doing something special and irreducible.
en.wikipedia.org/wiki/AI_effect
>The AI effect occurs when onlookers discount the behavior of an artificial intelligence program by arguing that it is not real intelligence.

Nooooo, you need us for things you wouldn't understand. What are you, an anti-semite?

You think we are at the end of technological/mathematical progress.
You are a total fucking retard.

No one said that mathematicians find the applications. I said that they create things that you can only find/prove things to be true via mathematics others can apply to their science (and yes, this includes numerical methods).
Learn2read, you fucking imbecile

>I said that they create things that you can only find/prove things to be true via mathematics others can apply to their science (and yes, this includes numerical methods).
Of course.
Historically mathematicians have developed a great many tools that are used in all areas of science.
Nowadays though they're just focused on abstract head-in-the-clouds problems that serve no practical purpose, except for collecting brownie points from other mathematicians.

A self-learning program is nothing more than a program that is PROGRAMMED TO optimize itself. It doesn't invent anything. It's pure trial and error. It's a fucking donkey remembering to not walk into a rock. Doesn't matter if we didn't know that rock before. Maybe it gives us insight into a new problem, but it doesn't invent.
A computer doesn't prove anything.
You obviously don't know what mathematics is.
Kys

You don't know the applications yet. Just like we didn't know 2000 years ago for algorithms.
You are a close minded retard.
Kys

>A self-learning organism is nothing more than a program that EVOLVED TO optimize itself.
See, I can make a compelling caps lock based argument too.

I'm going to kms now

Except the "pure" mathematics quite often turns out to have "practical" applications.
You just never know in advance.

G. H. Hardy wrote a famous essay about how he was proud that nothing he'd ever done would be of practical use.
He'd be appalled by the way his work HAS been utilized since his death.

False equivalency
Programming =/= evolution
Human programming is just a metaphor.
Kys

ok then let's have them develop the hundreds of years worth of theoretical framework required to come up with those algorithms as well

Thank you for your understanding

Mathematicians are fucking useless

If an engy/physicist/etc makes a wrong model, it will simply not work. If we were blindly making models, we would still reach the same result without mathfags through the evolutionary process of the scientific method

Mathfags "prove" shit that scientists already knew experimentally, but muh rigor

>Short term optimization isn't exactly the same as long term optimization
They don't need to be exactly the same. We can probably be a little more efficient than the millions to billions of years timescale nature works with.
And the point still remains that there isn't anything *fundamentally* special about what a human does that can never be reproduced artificially. That's just a retarded position to try to hold onto. Do you think your brain is made out of pure irreducible fairy magic you idiot?

>theoretical framework required to come up with those algorithms
That's the point, a theoretical framework isn't necessarily required. It's possible to get working answers without having any sort of "theoretical" basis you're personally aware of. The basis is there all along in the sense that someone could do the work of discerning it and writing about it if they wanted to, but you don't need to know what that basis is in order to have solutions that work because of that basis.

What do you think would've happened to the field of numerical analysis if Taylor hadn't come up with his idea of function approximations, or without the works of Newton and others, which are essential to understanding optimization techniques?
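For what it's worth, Taylor's idea is small enough to sketch in a few lines (a toy illustration, not how production libraries actually compute exp):

```python
import math

def exp_taylor(x, terms):
    # partial sum of the Taylor series of e**x expanded around 0
    return sum(x ** n / math.factorial(n) for n in range(terms))

print(exp_taylor(1.0, 10))  # agrees with math.e to about 6 decimal places
```

Ten terms already pin down e to roughly six decimals, which is the whole point: a short, computable expression standing in for a function you can't evaluate directly.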

Moreover, anything beyond introductory statistics/probability is very hard to understand without some basic knowledge of mathematical analysis.

>what do you think would've happened
There are more approximation techniques than you can count, any one of them not existing in some alternate history timeline would at worst just mean less efficiency for a while. The idea of approximation is a pretty obvious one in general, it's almost impossible not to eventually come up with an approach to accomplishing that.

Lol, guess what motherfucker.
Every sufficient approximation method with an error estimate is based on a theorem/invention made by a mathematician or someone with a mathematical background.
Lmao retard!

That's why you learn about and advance numerical methods, aka a branch of mathematics. You learn to get the magic. Of course it doesn't come by itself. Lol what a troll argument.

Just because you can relate any heuristic back to a mathematical theorem doesn't mean the person implementing that heuristic himself had any knowledge of that mathematical theorem.
Sometimes someone just stumbles on something that works, e.g. the fast inverse square root probably wasn't written by someone working purely in terms of rigorously defined mathematics (and in fact, others who tried after the fact to derive the same solution using rigorously defined mathematics ended up with different, more explicable but less efficient methods). You can identify a step of Newton's method in there, used to increase accuracy, but the main steps taken to get an answer before that accuracy increase still aren't clearly pulled from any particular known mathematical method.
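For reference, here's that fast inverse square root transliterated from the original Quake III C into Python. The magic constant 0x5f3759df is the real one; the Python transliteration itself is mine and only meant to show the trick.

```python
import struct

def fast_inv_sqrt(x):
    """Approximate 1/sqrt(x) using the Quake III bit-level trick."""
    half = 0.5 * x
    # reinterpret the 32-bit float's bits as an integer
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    i = 0x5f3759df - (i >> 1)  # the inexplicable initial guess
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # one step of Newton's method: this part IS a known technique
    return y * (1.5 - half * y * y)

print(fast_inv_sqrt(4.0))  # ~0.499, i.e. 1/sqrt(4) to within ~0.2%
```

The bit-shift-and-subtract line is exactly the part nobody derived from theory up front; only the final Newton step maps cleanly onto a named method.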

>You can't solve most differential equations numerically
???
It's literally the reverse, you can't solve most diffeqs analytically but it's piss easy to solve numerically.
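To illustrate with the crudest scheme there is (forward Euler, a toy example): y' = y with y(0) = 1 has the analytic solution e^x, and the numerical version is a few lines.

```python
import math

def euler(f, y0, x_end, steps):
    # forward Euler: repeatedly step along the tangent line
    h = x_end / steps
    y = y0
    for i in range(steps):
        y += h * f(i * h, y)
    return y

approx = euler(lambda x, y: y, 1.0, 1.0, 100_000)
print(approx)  # matches math.e to about 4 decimal places
```

The same loop handles plenty of equations that have no closed-form solution at all; only the lambda changes.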

>can't even represent most numbers on a computer.
Yes you can, but floating point representation is used for numerics for performance reasons.
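The trade-off in two lines, in case anyone doubts it:

```python
# Floating point is fast precisely because it rounds: 0.1 and 0.2 have
# no exact binary representation, so their sum isn't exactly 0.3.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False
```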

P vs NP: some problems grow exponentially, so even a computer 10^100000 times more powerful couldn't get an answer.
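Rough numbers behind that claim (the 10^9 checks per second is an assumed rate; the exact figure barely matters):

```python
checks_per_second = 10 ** 9  # assumed hardware speed

# a brute-force search over n binary choices costs 2**n checks
for n in (40, 80, 120):
    seconds = 2 ** n / checks_per_second
    print(n, f"{seconds:.2e} s")
# n = 120 already needs ~1.3e27 seconds; the universe is ~4e17 seconds old
```

Making the hardware a thousand times faster only buys you about 10 more items, which is why better algorithms, not faster chips, are the bottleneck.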

Optimization needs mathematics to find new methods and structures; deep learning and machine learning, for example, sit closer to mathematics than to computer science.

someone had to first invent the tools so that monkey could use them

>That's why you learn about and advance numerical methods, aka a branch of mathematics.
Genetic algorithms are pretty devoid of explicit mathematical theory. They basically amount to just copying nature. And I'm pretty sure the guy who came up with this for example:
en.wikipedia.org/wiki/Artificial_bee_colony_algorithm
is only a comp sci academic, not a formal mathematician.
I don't think you can ever prove mathematically rigorous theory is literally necessary for approximation / optimization, it's more that mathematically rigorous theory is capable of increasing efficiency. The fact natural evolution worked is evidence you can optimize without consciously discerning mathematical principles beforehand. Nature of course knows nothing of mathematics and still gets results.
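To make the "copying nature" point concrete, here's a bare-bones evolutionary loop (a toy sketch, nothing like Karaboga's actual bee colony algorithm): it maximizes a function by blind mutation and selection alone, with no calculus anywhere.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def fitness(x):
    return -(x - 7.0) ** 2  # peak at x = 7, unknown to the loop below

population = [random.uniform(-10, 10) for _ in range(20)]
for _ in range(200):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                            # selection
    mutants = [s + random.gauss(0, 1) for s in survivors]  # blind mutation
    population = survivors + mutants

best = max(population, key=fitness)
print(best)  # ends up near 7 by trial and error alone
```

Nothing in the loop knows where the peak is or what a derivative is; it just keeps whatever happened to score better.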

Nothing unsolvable analytically is solvable numerically. At best you can get approximations.
And you obviously can't represent most numbers on a computer; you can't even represent most rationals, let alone all the rest.

>At best you can get approximations.
How are approximations a bad thing? For any given task you're trying to accomplish there exists an approximation to a "perfect" answer that's more accurate than what you need to accomplish it. There is no reason to increase accuracy beyond whatever that minimum accuracy need is.

>Nothing unsolvable analytically is solvable numerically. At best you can get approximations.
It's common to call the result of a numerical scheme a solution, even if it isn't in an analytical sense. Stop being such a sperg.

>And you obviously can't represent most numbers on a computer; you can't even represent most rationals, let alone all the rest.
Yes you can. A computer is well capable of building an expression tree. That's exactly what math programs do.
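As a concrete example of storing structure instead of a rounded decimal, Python's stdlib keeps any rational exact:

```python
from fractions import Fraction

a = Fraction(1, 3)
b = Fraction(1, 6)
print(a + b)                    # 1/2, exactly
print(a + b == Fraction(1, 2))  # True
# the float version of the 0.1 + 0.2 problem disappears:
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```

The point stands that only finitely many values fit in finite memory, but "can't represent most rationals exactly" is a choice of encoding, not a hard limit for any rational you actually write down.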

>Nothing unsolvable analytically is solvable numerically. At best you can get approximations.
>And you obviously can't represent most numbers on a computer; you can't even represent most rationals, let alone all the rest.
And literally no one cares

I didn't say approximations are bad, just that you can't solve things numerically with computers. You can only represent a tiny sliver of the rationals even if you have all the computer memory that will ever exist from the big bang to heat death.
High-level abstractions, like everything that's defined using limits, are necessary, and computers can't do any abstractions; you can't, as far as we know, program them to come up with abstractions.

Of course everyone cares senpai. Differential equations come up all the time in many fields. Obviously you'd want to figure out general solutions.

>you can't as far as we know program them to come up with abstractions
The programming language Lisp was literally created specifically to do symbolic differentiation with programs.
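A toy version of that kind of program, using nested tuples as the expression tree (the representation is ad hoc, just to show the idea):

```python
def diff(expr, var):
    """Symbolically differentiate an expression tree like ('*', 'x', 'x')."""
    if expr == var:
        return 1
    if not isinstance(expr, tuple):  # a constant or a different variable
        return 0
    op, a, b = expr
    if op == '+':  # sum rule
        return ('+', diff(a, var), diff(b, var))
    if op == '*':  # product rule
        return ('+', ('*', diff(a, var), b), ('*', a, diff(b, var)))
    raise ValueError(f"unknown operator {op!r}")

print(diff(('*', 'x', 'x'), 'x'))  # ('+', ('*', 1, 'x'), ('*', 'x', 1)), i.e. 2x
```

The output isn't simplified, but it's a genuine symbolic derivative, not a numerical approximation, manipulating the abstraction itself rather than evaluating it.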

Literally every language has a symbolic math library, you fucking ledditor.

I am right, you are wrong, kys; the thread

I know they do, but usually an analytic solution can't be found, and in such cases an approximation is absolutely enough.
Remember that we live in a physical world with physical limitations. No point in calculating the height of your house to a precision finer than the diameter of an atom.

Einstein would not have discovered general relativity without mathematics.
A lot of boundary pushing theories in physics make use of contemporary mathematical concepts and vice versa.

>discovered
*stolen

Einstein didn't live in an era of cheap, fast computers with numerical methods software

Oh yeah? Then tell me how "cheap fast computers" and their "numerical methods" can prove theorems that make the lives of everyone easier?
Spoiler: they don't.
You need people working on abstract math so you faggot engineers can perform your shitty jobs easily without wasting too much resources (or even not solving the problem at all).

Here's a dumb example: a bunch of numerical methods use results from linear algebra in order to work. If no one had worked on building linear algebra and proving shit for it, then you wouldn't have your numerical methods. And guess what? Computers can't do that. We can indeed use them to check if a result holds over a given interval, but computers can't think to the point of discovering theorems on their own.
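To flesh out the dumb example (the function name is made up, but the math is the standard normal equations from linear algebra): even a basic least-squares line fit leans directly on a proved linear algebra result.

```python
def fit_line(points):
    """Least-squares fit y = a*x + b via the normal equations A^T A c = A^T y."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    # that this 2x2 system is solvable (A^T A invertible for distinct x's)
    # is a theorem someone had to prove, not luck
    det = sxx * n - sx * sx
    a = (sxy * n - sx * sy) / det
    b = (sxx * sy - sx * sxy) / det
    return a, b

print(fit_line([(0, 1), (1, 3), (2, 5)]))  # (2.0, 1.0): the points lie on y = 2x + 1
```

Every curve-fitting button in every stats package bottoms out in results like this one.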

We used to need horses for transportation too.

Yeah, come back here when you have a computer that does what mathematicians do.
Until then, go fuck yourself and merry Christmas.

Come back when you have a mathematician that can do what computers do.

What the hell do you mean? Of course we can't and won't be able to perform calculations that fast, but that doesn't mean that computers or mathematicians are useless. We need both. Mathematicians (humans) can think in a way that computers can't. We need mathematicians to work on unsolved problems in order to make everything easier or to make things possible to solve. We also need computers to perform a lot of calculations, which we use in simulations and a bunch of other problems in engineering and science in general.
By the way, read this .

>Mathematicians (humans) can think in a way that computers can't.
I don't think "can" and "can't" are the right words there.
If humans behave according to physical cause and effect bodily processes then why would you believe a non-human machine "can't" implement something comparable?
It sounds to me like you're making more of an argument for investing resources into making machines approximate more of what humans do that they don't already approximate, rather than an argument for investing resources in having humans continue to come up with new answers to problems themselves.

>If humans behave according to physical cause and effect bodily processes then why would you believe a non-human machine "can't" implement something comparable?
The problem here is that nature had A LOT of time to build our brains so that we could use them for what we do. I'm not saying that it is impossible for machines to do the same, and we surely need to invest in building smarter computers, but as long as we can't reach the level of a human brain with a computer, we should keep using humans to improve mathematics. Maybe some results in math would help us with building such computers.
My point is, for now, we can't just cut investment on both. We need to keep upgrading everything we can until something better comes up.

I really hope, by the way, that computers someday will be able to do a mathematician's job, but for now, there's nothing we can do and we still need these people working on math.

>"Oh you're a math major? What's 474832941 times 2821474282339"

Retard.
You are actually hoping for computers to continue/replace the legacy of humankind.

You are too butthurt to admit you got a grudge against mathematicians and are releasing your anger on the internet as a cuckboy.
Kys

Analytical solutions give insight into the relationship of parameters. You can't derive shit from numbers.

Take a physics course.