Studying math instead of compsci

>studying math instead of compsci

Where is the "doing" part of mathematics?

Great, you wrote this proof, now where's the "compile" button? How do you have enough fun with it to continue doing it?

How is mathematics NOT completely antiquated by programming (name one mathematical construct you can't represent in a programming language)?

>chemists run tests in a lab and discover new compounds
>physicists run all kinds of tests involving matter/kinetics/etc.
>even fucking biologists run tests on animals and bacteria
>math…well, we write down symbols and…

mathfags BTFO

>(name one mathematical construct you can't represent in a programming language)
real numbers

symbolic computation. next.

I agree with the general idea that more mathematics should be intertwined with programming. And don't tell me that expensive proprietary shitware like Mathematica is good enough, it's not. However.

>How is mathematics NOT completely antiquated by programming
This is silly. Plenty of programmers know far too little about math and write overcomplicated, shit solutions to problems that mathematicians have been optimizing for decades. Maybe I'm just confused about what you're implying reality should be.

Do you mean computing real numbers or proving that real numbers exist? Real-number arithmetic in software is bounded only by available memory (hardware floats are limited by word size, but arbitrary-precision libraries are not), and computing with reals is probably the most basic introductory task in software.
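To illustrate (a minimal sketch; fractions and decimal are in Python's standard library, the example itself is mine, not from the thread):

    from fractions import Fraction
    from decimal import Decimal, getcontext

    # exact rational arithmetic: no rounding at any step
    print(Fraction(1, 3) + Fraction(1, 6))  # 1/2, exactly

    # arbitrary-precision decimals: precision is a setting, bounded by memory
    getcontext().prec = 50
    print(Decimal(2).sqrt())  # sqrt(2) to 50 significant digits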

Real numbers don't exist. :^)

I'm not saying mathematics shouldn't exist. I'm saying you shouldn't study math because it's fucking useless compared to programming.

>And don't tell me that expensive proprietary shitware like Mathematica is good enough, it's not.

No, not Mathematica, but Haskell, Common Lisp, Python, and C++ are all very capable of expressing mathematical concepts in more powerful ways.
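For instance (a hypothetical Python sketch; the class name ZMod is mine, not anything standard), an algebraic structure like the ring Z/nZ maps directly onto code:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ZMod:
        """An element of the ring Z/nZ."""
        value: int
        n: int

        def __post_init__(self):
            # normalize the representative into [0, n)
            object.__setattr__(self, "value", self.value % self.n)

        def __add__(self, other):
            assert self.n == other.n
            return ZMod(self.value + other.value, self.n)

        def __mul__(self, other):
            assert self.n == other.n
            return ZMod(self.value * other.value, self.n)

    print(ZMod(5, 7) + ZMod(4, 7))  # ZMod(value=2, n=7)
    print(ZMod(5, 7) * ZMod(3, 7))  # ZMod(value=1, n=7): 3 is the inverse of 5 mod 7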

>a code monkey thinks he has an opinion on math

Back to your computer games fag

Mathematics is the Master race

> muh non-computable reals

/thread

>not being a mathfag and codemonkey

>threading yourself

>name one mathematical construct you can't represent in a programming language
who do you think invented them you fucking idiot

>useless
I don't know why you thought that would be a compelling argument.

>thinks that there exists a mathematical process that can be done by hand which cannot be done by a computer
Unless you're saying computers don't give us new magical powers, in which case, no shit.

> compsci 2016
> in-depth courses on SQL, tree algorithms, Java User Interfaces, etc
> short introductory courses on NNs, Quantum Computing, Distributed Cloud and GPU acceleration, etc

enjoy your degree, that skill set is perfect for the year 1998.

You're one dense mudda humper

>Where is the "doing" part of mathematics?
all of mathematics
>Great, you wrote this proof, now where's the "compile" button? How do you have enough fun with it to continue doing it?
not sure what you mean, but the discovery of some property of a mathematical structure that arises from other properties is already a compilation of established 'steps': by proving a theorem you're compiling those steps into a statement that says that if conditions A hold, then B follows.

>How do you have enough fun with it to continue doing it?
reaching a mathematical proof of a statement can be incredibly satisfying.

>How is mathematics NOT completely antiquated by programming (name one mathematical construct you can't represent in a programming language)?
all of it; computers still cannot do any mathematics. and as for representing constructs, computers can only represent some of the constructs of discrete mathematics.

>chemists run tests in a lab and discover new compounds
>physicists run all kinds of tests involving matter/kinetics/etc.
>even fucking biologists run tests on animals and bacteria
>math…well, we write down symbols and…
you can do mathematics without writing down any symbols; they exist to make things easier and more manageable, and to communicate what you deduced to others.

You guys are posting in a troll thread.
and none of you even mentioned that numerical methods are a current mathematical research topic.

you can't represent real numbers in any way in a computer, or do anything with them.
for example, if you want to add two arbitrary real numbers, a computer can't do that.

computers are only useful for doing computation with numbers which can be represented by integers, and most real numbers can't be.

Any real numbers you can add by hand can be added by a computer to the same degree of accuracy. You're being fucking stupid.

> OP complains that studying mathematics does not include doing fancy things like chemistry, physics and biology
> refute by stating that you can do an even more boring thing, maths without writing down steps
logic bro
btw OP, if you are studying mathematics and you are not an AIfag you are basically fucked. learn python and do some applied stuff, or good luck being a high school teacher

writing or not writing down steps has nothing to do with actually doing mathematics. the main reason mathematical notations exist is to communicate mathematics.
> OP complains that studying mathematics does not include doing fancy things like chemistry, physics and biology
well, studying physics myself, the mathematics was no less 'fancy' than any of the physics. perhaps i don't understand what you mean by 'fancy'.

>Any real numbers you can add by hand can be added by a computer to the same degree of accuracy. You're being fucking stupid.
>degree of accuracy
last time i checked, addition doesn't have 'accuracy'; either the number you get is the sum of the ones you added or it isn't.

a computer cannot do the computation
π+π
because it can't represent irrational numbers.
it can treat irrational numbers as vectors and perform operations in the vector space of the reals over the integers/rationals, but it can't actually do anything with the irrational numbers themselves.

and this is just about the calculations a computer can't do; as for the mathematics it can't do, it's just about all of it. computers are incapable of 'coming up' with new abstractions necessary for mathematics. they can, however, be very useful for doing things with man-made abstractions.
en.wikipedia.org/wiki/Automated_theorem_proving

obviously there isn't anything fundamental about human brains that can't theoretically be done by a computer (which may be VERY different from today's computers), but we're simply not there yet.

>I agree with the general idea that more mathematics should be intertwined with programming

you should look up APL, a very math-oriented programming language. It is very rarely used these days, but interesting to learn.

>last time i checked, addition doesn't have 'accuracy'; either the number you get is the sum of the ones you added or it isn't.
I assume we are comparing to what is possible by hand without a computer, as I don't see the point of any other discussion. Any real that cannot be exactly represented by a computer, even in principle, cannot be exactly represented on paper either.

>a computer cannot do the computation
>π+π
>because it can't represent irrational numbers.
A computer can process the symbolic expression π+π and produce the result 2π, given a symbolic solver.
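A minimal sketch of what "given a symbolic solver" means, using sympy (a real Python library; the example is mine, not something anyone in the thread ran):

    import sympy

    print(sympy.pi + sympy.pi)  # 2*pi, exact, no approximation involved
    print(sympy.sin(sympy.pi))  # 0, applied as a known identity
    print(sympy.sqrt(2) ** 2)   # 2, again exact

Nothing here rounds π to a float; the symbol is manipulated by the same rewrite rules a human would apply.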

>computers are incapable of 'coming up' with new abstractions necessary for mathematics .
That's what programmers are for.

>they can however be very useful for doing things with man made abstractions .
> ...
>we're simply not there yet .
I don't know if you're implying the hardware needs to be modified, or the software doesn't exist, but the former is untrue and the latter is silly. If we can figure out a general approach to a type of problem, we can codify that general approach. That is what matters, is it not?

yes it can
π
here, i exactly represented an irrational number.

>A computer can process the symbolic expression π+π and produce the result 2π, given a symbolic solver.
as i said, it can do that, and you can shove identities in there like sin(π)=0, but it still treats π as a 'symbol' and not a number, and would do the same for
A+A=2A
so at no point can it do anything with the actual number, only with integer scalars and the identities people put in there.


you can have a computer fuck around with identities such as sin(π)=0, but in no way could a computer come up with such an identity on its own, without a human putting in an identity that is in some way equivalent to it.

another thing a computer can't do is something like sin(e) or sin(7). it can fuck around with rational approximations of these numbers, while mathematicians can do things with the actual numbers.

> computers still cannot do any mathematics

Wrong. Guess you don't know about the field of automated theorem proving and the major results from it, like EQP's 1996 proof of the Robbins conjecture, an open problem for over half a century.
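A toy-sized point of reference (a hedged sketch using z3, a real SMT solver with Python bindings, installable via pip install z3-solver; the example is mine):

    from z3 import Bools, Implies, And, prove

    p, q = Bools('p q')
    # modus ponens as a validity check: (p ∧ (p → q)) → q
    prove(Implies(And(p, Implies(p, q)), q))  # prints: proved

Tiny, obviously, but it is the machine, not a human, that searches for the proof.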

>yes it can
You're going full-blown stupid right now. If you write a symbol on paper, it's the "actual number", but if you represent a symbol in a computer, it's "just a symbol". Okay.

>so at no point can it do anything with the actual number , only with the scalars of the integer field and identities people put in there .
Neither does a human fucking being, are you serious?

>but in no way could a computer come up with such an identity on its own
THIS is your argument? You already provided a source, automated theorem proving. Yes, you are correct that computers don't have sentience. We can still instruct them to follow ANY sequence of steps we would.

>another thing a computer can't do is something like sin(e) or sin(7). it can fuck around with rational approximations of these numbers, while mathematicians can do things with the actual numbers.
See above.
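For what it's worth, the sin(7) claim is directly checkable with the same symbolic-solver sketch from earlier (again sympy, again just an illustration):

    import sympy

    expr = sympy.sin(7)       # stays exact: sin(7), not a rational approximation
    print(expr)               # sin(7)
    print(sympy.sin(sympy.E)) # sin(E), also exact
    print(expr.evalf(50))     # a 50-digit approximation only when you ask for one

The approximation appears only at the final evalf step, which is exactly when a mathematician would reach for a decimal too.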

The guy you're responding to isn't the one who provided that source.

Follow more than one post back to see the person I'm talking to mention automated theorem proving.

Okay sorry, carry on then.

>If you write a symbol on paper, it's the "actual number", but if you represent a symbol in a computer, it's "just a symbol". Okay.
that's the critical difference between mathematics and a bunch of symbols on a piece of paper. the 'actual number' is in my head; its representation, so i can communicate it to other people, is on a piece of paper.
same with computers: they symbolically fuck around with these numbers, but only the humans looking at them actually know what they are, so the computer is equivalent to the piece of paper in this sense.

>Neither does a human fucking being, are you serious?
a human being can come up with numbers like π and e; a human can come up with something like e^(iπ)=-1. these things require you to actually come up with mathematical statements about irrational numbers, which computers can't do.

>We can still instruct them to follow ANY sequence of steps we would.
exactly my point. computers are, as the name suggests, capable of computation; computers are incapable of mathematics, as they are incapable of abstraction (today's computers, that is).

you seem to be confused about what field you are talking about. Mathematicians don't write down numbers with "degrees of accuracy", they prove general facts that hold true regardless of how they are written. E.g. you are not going to get a proof of irrationality out of a decimal representation of a number (though of course, by loading a computer full of theorem provers, it could prove simple cases, but it wouldn't use the decimal representation to do so).

Computers can do things we tell them to, and can apply proof procedures that we've told them to (or ones based off of those we've programmed). But a computer is not going to come up with the duality between rings and affine schemes. Heck, there are many non-noetherian schemes that we can barely compute with, but we know properties about them because we know general principles about commutative algebra and how schemes and rings are related.

Much of mathematics involves building new techniques and new fields; computers are generally useful insofar as they can check cases or possibilities once those have been laid out.

So, yeah, as far as your greentext, mathematicians come up with new structures to relate different branches of math in new ways to solve problems and to abstract them. If you ask a computer if two spaces are homeomorphic, it will try a bunch of tests you told it about, but it is not going to invent homological algebra.

i actually mentioned automated theorem proving. it is, however, computation and not mathematics. they simply have a mathematical statement A and a bunch of 'steps' they iterate a jizillion times to see if they can reach statement B.

they are basically used to check if A->B by iterating man-made steps, and that's not mathematics.
the day a computer can come up with the idea of a limit or the idea of a derivative is the day computers can do mathematics.

also, even though i'm arguing that computers at this point do not in any way do what mathematicians do, that line of argument is a silly one. Mathematicians do synthetic reasoning where the thing is "just a symbol" all the time. And even though I really don't think that computers "do" mathematics in the way that a mathematician does, I think there are many formal objects which mathematicians just treat as formal symbols, though we happen to have a lot more knowledge of "why" to try something, and various identities related to those symbols... sometimes. In these cases, automated theorem proving will be useful, because it's roughly doing similar things.

but that's only a tiny bit of PROVING things in mathematics, not doing mathematics.

First half of your post is responding to someone who was responding to:
>computers are only useful for doing computation with numbers which can be represented by integers, and most real numbers can't be.

So take that into context. The point was likely the same one I'd been trying to argue, which is that nothing magically makes pencil-and-paper "more valid" than data structures.

Rest of what you said I agree with.

why are humans the only magical devices in the universe that can create "real" things out of nothing?

>Much of mathematics involves building new techniques and new fields; computers are generally useful insofar as they can check cases or possibilities once those have been laid out.

The world building aspect of math so often gets ignored. This was Grothendieck's big contribution.

>Mathematicians do synthetic reasoning where the thing is "just a symbol" all the time.
i never said they don't. i just said that if the symbol is some abstract mathematical concept, a human who understands it can come up with identities you can't come up with by treating it as just a symbol.

>mathematical construct you can't represent in a programming language
Proof by contradiction.
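For the record, proof assistants represent proof by contradiction directly. A minimal Lean 4 sketch (classical logic; Classical.byContradiction is in core Lean, the example itself is mine):

    -- double-negation elimination, proved by contradiction:
    -- Classical.byContradiction : (¬p → False) → p
    example (p : Prop) (h : ¬¬p) : p :=
      Classical.byContradiction (fun hnp => h hnp)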

> a human who understands it can come up with identities you can't come up with by treating it as just a symbol.
This is still a silly argument. A human comes up with identities/formulas/techniques/whatever using other math knowledge. That other knowledge, if codified, would allow a computer to find the same identity.

However, I am now under the impression you are actually arguing the same point and simply using terrible analogies and bad wording. In that case, I agree with what you intend.

Your argument is "if we can train computers to think like humans, then humans would be useless."

>why are humans the only magical devices in the universe that can create "real" things out of nothing?
we don't know if they are. we simply know empirically that there isn't anything else which has come up with something we can identify as a mathematical statement.

maybe in the future we'll have computers capable of coming up with new abstractions, since we don't know of anything fundamental neurons do which can't be somehow artificially simulated.

>That other knowledge, if codified, would allow a computer to find the same identity.
while it's all probably theoretically possible, the vast majority of this "math knowledge" cannot be codified (at least not with anything currently available).

humans are for some reason capable of abstractions like limits, which we currently have no idea how to codify.

>Your argument is "if we can train computers to think like humans, then humans would be useless."
>However, I am now under the impression you are actually arguing the same point and simply using terrible analogies and bad wording. In that case, I agree with what you intend.

No, that is not my argument. My argument is that the arguments I chose to respond to were silly.

What does an engineer do when he wants to solve a partial differential equation? He'll most likely run it through a solver that uses a finite element method to solve it approximately.

This method has not been invented by a computer, but by mathematicians. Similarly, if we seek to improve the methods we have or develop new ones, a computer can't do it. It usually takes a mathematician to determine the best algorithm and (more importantly) how well it works and how good the approximations are.
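To make that concrete, a hedged toy sketch (my own example, not anything from the thread): 1D linear finite elements on a uniform mesh for -u'' = f on [0,1] with u(0)=u(1)=0, where the hat-function basis yields the classic tridiagonal stiffness matrix. The mathematician's contribution is knowing that the error shrinks like O(h²); the computer only solves the linear system.

    import numpy as np

    n = 100                        # interior nodes
    h = 1.0 / (n + 1)              # mesh width
    x = np.linspace(h, 1 - h, n)   # interior node positions

    # stiffness matrix for piecewise-linear elements: (1/h) * tridiag(-1, 2, -1)
    K = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h

    # load vector for f(x) = pi^2 * sin(pi*x); exact solution is u = sin(pi*x)
    b = h * np.pi**2 * np.sin(np.pi * x)   # simple lumped quadrature

    u = np.linalg.solve(K, b)
    print("max error:", np.abs(u - np.sin(np.pi * x)).max())  # ~1e-4 at this mesh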

'computer' 'science'