How Isn't Stephen Wolfram a Polymath?

1. Published numerous often-quoted papers in particle physics.
2. Created Mathematica and Wolfram Alpha, which have evolved into the Wolfram Language, by far the most powerful computational tool out there.
3. Is the biggest name in Cellular Automata. By far.
4. Computational Equivalence is controversial, but I think it is true, and it is incredibly powerful.
5. He made dozens of small, but insightful and widely used, discoveries in a range of disciplines using Simple Programs.

He is just as smart as Poincaré or Euler in my opinion.

Terence Tao is overrated and only good at a handful of topics in pure math. Not that von Neumann wasn't a polymath, but come on.


Who says he isn't?
Usual criticism is that he's way too full of himself. Just have a quick look through Idea Makers - you can hear him licking his own asshole as you turn the pages.

He's certainly talented, but nowhere near the likes of Euler (as his ego would have you believe). Terry Tao has amazing breadth as well as depth to his knowledge of math; if there's someone you can ask a math question without them going "lol idk", it's him. There are very few areas of math Tao hasn't contributed to.

Put Tao's contributions next to Wolfram's. At best Tao has more serious contributions in narrowly defined fields.

So? Poincaré was the same way. Einstein was the same way in real life and just fed bullshit to reporters.

Tao has groundbreaking results in multiple fields of mathematics (some of which have already "bubbled up" to industry). Wolfram's only comparably groundbreaking work is in one field: cellular automata.

Einstein is objectively far less brilliant though. He just published first. Kinda like Darwin.

He was hyped by the US to create a consensus among the populace.

I'm actually interested. Can you give some examples of those results which have achieved industrial application?

You know nothing of Wolfram or Mathematica or the Wolfram Language. Wolfram is basically a God in Comp Sci. And he has dozens of scientific results in various fields, from fluid dynamics to biology to neuroscience to high energy particle physics.

This is accurate about Einstein. Clearly Wolfram is smarter relative to historical peers.

There is some serious wolfram bashing on this board I have noticed.

I have read his book A New Kind of Science and it's really fascinating. It's less an educational book than it is a history of science, or to be more precise, a history of scientific thinking! He is trying to close the gap between reality and logic as far as he can, and after you're done you may not be equipped to build your own Turing-complete CA, but you are at least equipped with a new kind of intuition regarding complexity and randomness.
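If you want to build that intuition concretely, an elementary cellular automaton takes only a few lines of code. Here is a minimal sketch in Python (the function names and the width/steps defaults are my own choices, not anything from the book):

```python
def eca_step(cells, rule):
    """One step of an elementary cellular automaton.
    'rule' is the Wolfram rule number (0-255); neighborhoods wrap around."""
    n = len(cells)
    out = []
    for i in range(n):
        # Read the 3-cell neighborhood as a binary number 0-7.
        idx = cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n]
        # Bit 'idx' of the rule number gives the new cell value.
        out.append((rule >> idx) & 1)
    return out

def evolve(rule, width=31, steps=15):
    """Start from a single black cell and return the list of rows."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = eca_step(row, rule)
        history.append(row)
    return history

if __name__ == "__main__":
    # Rule 110 is the one Matthew Cook proved Turing-complete.
    for row in evolve(110):
        print("".join("#" if c else "." for c in row))
```

Swapping in different rule numbers (30, 90, 110) reproduces the structured-but-irregular pictures the book dwells on.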

fun fact, he recorded every single keystroke he made on his working laptop for over 30 years!!!
Imagine the value of this data. The things you could mine from 30 years of work! That is dedication!

>fun fact, he recorded every single keystroke he made on his working laptop for over 30 years!!!
That is autistic and useless. He did figure a lot of shit out, but what has that data brought?

>autistic and useless.
autistic maybe but it sure as hell is not useless.

I'll give you an example of why this practice is useful in our digital age. Text input fields in most programs, apps, and online forms (take this Veeky Forums reply box, for example) need you to confirm your input to proceed; all the edits you make before confirming are not logged (or are temporary at best) and get lost after your session.

If you have 30 years of edits you will surely find interesting patterns in your editing process. The sheer amount of data will even out unique events and make you realise underlying patterns in your behavior.

Maybe this data wouldn't have value to you, but you should now understand why every single government or company might be interested in having the keystroke data (including the "deleted" stuff) of every single human.
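As a toy illustration of the kind of mining being described, here is a sketch that computes a crude "revision ratio" from a keystroke log. The log format and field names are invented for the example; nothing here reflects Wolfram's actual setup:

```python
from collections import Counter

# Hypothetical keystroke log: a list of (timestamp_seconds, key) pairs.
log = [
    (0.0, "h"), (0.2, "e"), (0.4, "l"), (0.6, "l"), (0.8, "o"),
    (1.0, "BACKSPACE"), (1.2, "BACKSPACE"), (1.5, "p"), (1.7, "!"),
]

def edit_stats(log):
    keys = [k for _, k in log]
    counts = Counter(keys)
    deletions = counts["BACKSPACE"]
    total = len(keys)
    # Fraction of keystrokes spent undoing earlier ones: a crude
    # proxy for how much revising happens during writing.
    return {"total": total, "deletions": deletions,
            "revision_ratio": deletions / total}

print(edit_stats(log))
```

With decades of real logs you could track how a ratio like this drifts over time, per project, or per time of day; that is the sort of pattern a single session can never show.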

That is idiotic. Most data is noise, which slows down computation.

By the way, someone close to Wolfram, if not Wolfram himself, posts here. Deduction is a trait of mine.

>if not Wolfram himself

are you stupid?

You'd be surprised.

>2. Created Mathematica and Wolfram Alpha which has turned into the Wolfram Language, by far the most powerful computational tool out there.
[citation needed]
>3. Is the biggest name in Cellular Automata. By far.
being good at memes isn't an achievement
>4. Computational Equivalence is controversial but I think it is true, and it is incredibly powerful.
>I think it's true
>I think
ok Wolfram

He is insufferable and deserves nothing but scorn.

Butthurt samefagger? Did Stephen steal your gf? It's incredibly pitiful reading your pathetic hate towards a person who does not give a fuck about you. What have you ever achieved?

A lot more than you and more to come. And I am the OP. Three of those are me. You suck.

Aw, did I hurt your feelings? I don't care about what you consider achievements. I came here because I wanted to talk about Wolfram and CAs. I tried to explain why recording keystrokes for 30 years can be made into valuable data, and all this other guy does (and now you) is shitpost.

In all seriousness, if you are not just trolling, then why feel offended?

Can we get back to topic?

>could be
>not is good data
cute
And, yes, Rule 30 is the most philosophically interesting thing I have ever looked at, and I was reading later Wittgenstein at age 15. I am more than double that now. And Wittgenstein may be wrong.

Rule 30 implies most ideas about determinism are wrong, but determinism itself could still be valid. It is fucking earth-shaking. I don't know how to describe it.

So are you OP again? Would be nice to know...
Why cute? It's what you make out of a sample of data that gives it value, isn't it?

Yeah, he is crazy about Rule 30, but what really had me interested was his research on randomness (pic related), which implied that randomness can arise out of non-random systems. It really turned my whole intuition about complexity upside down.
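That claim is easy to check yourself. The center column of Rule 30, which Mathematica's random number generator famously drew on, comes out statistically balanced even though the rule and the initial condition are completely deterministic. A quick sketch (my own minimal implementation, not Wolfram's code):

```python
def rule30_center_column(steps):
    """Run Rule 30 from a single 1 on a wide-enough row and
    collect the center cell at each step."""
    width = 2 * steps + 3          # wide enough that the edges never matter
    row = [0] * width
    center = width // 2
    row[center] = 1
    bits = [row[center]]
    for _ in range(steps):
        new = [0] * width
        for i in range(1, width - 1):
            left, c, right = row[i - 1], row[i], row[i + 1]
            # Rule 30: new cell = left XOR (center OR right)
            new[i] = left ^ (c | right)
        row = new
        bits.append(row[center])
    return bits

bits = rule30_center_column(1000)
# Deterministic input, yet ones and zeros come out roughly balanced.
print(sum(bits) / len(bits))
```

Running longer and applying real statistical test batteries is where the "randomness from non-random systems" claim gets its empirical support.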

Yeah, I am OP.
I think everything you are talking about is already implied by Computational Equivalence. I guess it is more evidence for it.


I truthfully can't derive Computational Equivalence, and neither can Wolfram, but after reading his work it seems it must be true.

Are we missing something? Because I think what you posted is just an example of the above.

I'm not contradicting you; I also think Computational Equivalence is real. But proving a universal cap on computation (given what we assume is infinite memory) has been shown impossible by the Halting Problem, hasn't it?

Also, I think I am far from being able to critique Wolfram's work; that's why I don't fully understand what you are getting at. What would we be missing?

>What would we be missing?
We are mostly on the same page, but we are missing the ability to doubt the systems we study. That may seem trivial, but Wolfram ignores it and logic (Gödel) demands it. We have to justify the system in which we eliminate computational possibilities, the same way we have to justify axioms in set theory. Wolfram basically says that basic shit doesn't matter. And while he may be right, there seems to be a void there that has to be justified.

and the halting problem is bullshit and will be replaced eventually
biggest false flag in math
"prove I can!"

I think the problem, and the cause of the public critique, is that he is using a finite sample of computer experiments to imply general axioms. I won't say "to prove" because he specifically underlines at the beginning of his book that none of its content is supposed to prove anything.

Bear in mind that all of Wolfram's experiments follow the rules of logic because they were all performed on a computer. One could even say that if a computer is proven to be Turing-complete, then every experiment on it is also a valid logical expression. So why should Wolfram doubt a computer?

man, if you call bullshit on Turing's work then you will have to explain yourself...
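For reference, the argument being called bullshit is short enough to state as code: assume a total decider `halts(f, x)` exists, build a program that does the opposite of whatever the decider predicts about it, and feed it to itself. A sketch (the `halts` argument is the hypothetical object; no real decider exists):

```python
def make_paradox(halts):
    """Given a claimed total halting decider halts(f, x) -> bool,
    build the program that diagonalizes against it."""
    def paradox(f):
        if halts(f, f):          # decider says f(f) halts...
            while True:          # ...so loop forever
                pass
        return "halted"          # decider says f(f) loops, so halt at once
    return paradox

# Any concrete candidate decider is refuted by its own paradox program.
# Example: a (wrong) decider that always answers "never halts":
always_no = lambda f, x: False
paradox = make_paradox(always_no)
print(paradox(paradox))  # prints "halted", contradicting always_no's answer
```

The same refutation works for any total candidate you plug in, which is the whole content of Turing's theorem.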

>man, if you call bullshit on Turing's work then you will have to explain yourself...
You answered your own question. The Halting Problem assumes programs have to run a certain way, and there is no way to doubt whether that is true. It is epiphenomenal in a sense: we can't justify it even under its own conditions, but it has to be true given our assumptions.

And I don't think Turing is right, anyways.

>Funny, you are literally saying that you don't believe, the computer in front of you doesn't exist.
No, I am claiming I don't know if the computer I am typing on is the only type of computational entity possible. You are much less smart than I assumed.

>and if I'm answering my own questions, why should I even bother to reply to you?
And yet you are. And yet you are wrong.

kek
of course you deleted it

The OP is RoosterRed, by the way. I have some trivial applied math in Econ that more people should look at. I don't claim the math is original, but I never came across any of what I wrote elsewhere.

steemit @roosterred

No one in compsci gives the barest hint of a shit about Wolfram. He's a literal WHO?

kek
You probably go to a Uni that encourages you to learn shit languages like Python.

Then express yourself properly, because you write like a butthurt teenager... Again, his work is proven by reality...

>I am claiming I don't know if the computer I am typing on is the only type of computational entity possible.
Finally an interesting question... What took you so long? Can you define 'computational entity'? Is there a difference between 'computational entity' and 'type of computation' to you? If yes, what?

I think a similar question Wolfram asked would be whether binary logic is in fact the computational language of nature. There are a lot of alternatives, like Fuzzy Logic, Quantum Logic, and Three-Valued Logic, but none seems to have found an application.

I deleted it because I had a wrong double negation in it. Here is my repost:

you take this discussion too personally, my friend.

>The Halting problem assumes programs have to run a certain way and there is no way to doubt that is true
woah, what? that's so wrong lol.

>And I don't think Turing is right, anyways.
Funny, you are literally saying that you don't believe the computer in front of you exists.

and if I'm answering my own questions, why should I even bother to reply to you?

I wish I could quote this effectively, but suffice it to say I want your actual words so I can prove they are wrong.

>I think a similar question Wolfram asked would be whether binary logic is in fact the computational language of nature. There are a lot of alternatives, like Fuzzy Logic, Quantum Logic, and Three-Valued Logic, but none seems to have found an application.
kek
Wolfram says there are trillions, literally numerically trillions, of possibilities, and you post this.
You understand nothing, not even basic probability theory. I wasted an hour on you.

I probably *went* to a Uni that taught me actual Computer Science, not some devops/scripting nonsense that you'd qualify for.

No one in academia worth their salt cares about Wolfram. At best he's mildly interesting, but nothing he's ever said or done is particularly profound or useful.

This is just false.

Read his blog, he sometimes brings it up.
terrytao.wordpress.com/2007/04/13/compressed-sensing-and-single-pixel-cameras/
How is he a God in CS? What about Dijkstra, Knuth, Floyd, Iverson, Hoare, Cook, Milner, ... There are dozens of people who have done more for CS than Wolfram. You're conflating his ego and his achievements.

Nice argument. What's he done that's actually interesting and isn't just shameless, self-indulgent self-promotion?

You go to a shit school that makes you learn Python.

>Wolfram says there are trillions, literally numerically trillions, of possibilities
...is exactly what I tried to explain in layman's terms... I can't read your mind... Some people don't even know that there are other types of logic besides Boolean. Even so, it's not about how many numerically exist, but whether there is a governing system above all of them...

So butthurt teenager it is, then? I didn't gain a single thing from our conversation either, so I think it's time to sage goodbye.

You are a meme. Go ahead.

This is the only thing you can come up with? I go to a school that doesn't make me learn any programming at all, since I'm in a mathematical physics programme. Nevertheless, every single one of the computer scientists I mentioned has had more impact on CS than anything Wolfram did.

What does that mean? You are intelligible. Even if what you say is true, so what? What does that prove?

The last polymath was French: Henri Poincaré.

>Jules Henri Poincaré (29 April 1854 – 17 July 1912) was a French mathematician, theoretical physicist, engineer, and philosopher of science. He is often described as a polymath, and in mathematics as "The Last Universalist," since he excelled in all fields of the discipline as it existed during his lifetime.

I am trying to help you. But you are probably to autistic to understand.

Oh, it's this delusional fanboy again...

You meant unintelligible.

He's an asshole. Fuck Wolfram.

samefagging is lame

Butthurt Wolfram fanboy tears are delicious.

you guys are toxic

>The OP is RoosterRed, by the way. I have some trivial applied math in Econ that more people should look at. I don't claim the math is original, but I never came across any of what I wrote elsewhere.
You are the idiot who FUDded Kyber. I remember you from biz.

The last polymath was Terence Tao, whose idea of "I'm bad at this topic in math" means "I didn't publish a groundbreaking paper (yet)".

He did a good job on Mathematica.

>Terence Tao

Enough with this meme

>to autistic
I mean, come on, man.

observe:

Being an idiot is lame