Brainlet here? What is the significance of proving this? I read the wiki but couldn't understand

Other urls found in this thread:

scottaaronson.com/papers/pnp.pdf
en.wikipedia.org/wiki/Descriptive_complexity_theory
en.wikipedia.org/wiki/Novikov_self-consistency_principle#Time-loop_logic

if true, then we know that a whole class of problems becomes more easily solvable in principle, we just don't know how to solve them yet

But why? Can you explain in layman please?

you can ignore the mystical non-determinism stuff and think about it like this. a problem in P can be solved in polynomial time. a problem in NP can be checked in polynomial time, i.e. if i give you an already computed answer, you can check in polynomial time whether this answer is in fact correct.
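the check-vs-solve distinction above can be made concrete. a minimal sketch (my own toy example, not from the thread) using SUBSET-SUM, a classic NP problem: verifying a proposed answer is fast, while the only general solving strategy we know is to try subsets.

```python
# SUBSET-SUM sketch: checking a certificate is polynomial-time,
# while the brute-force solver tries up to 2^n subsets.
from itertools import combinations

def verify(nums, target, certificate):
    """Polynomial-time check: is the certificate a subset of nums summing to target?"""
    return all(x in nums for x in certificate) and sum(certificate) == target

def solve_brute_force(nums, target):
    """Exponential-time search: enumerate all subsets and check each one."""
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 34, 4, 12, 5, 2]
cert = solve_brute_force(nums, 9)   # finds e.g. [4, 5]
print(verify(nums, 9, cert))        # True: checking the answer is the easy part
```

the asymmetry is the whole point: `verify` runs in time linear in the input, `solve_brute_force` doesn't.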

so why does anyone think these classes are related? well basically computing is a young field so there was a ton of low hanging fruit, problems that seemed hard were solvable efficiently (in P) with simple strategies like greedy algorithms and memoization. for example dijkstra's shortest path algorithm, it's like the most obvious approach, an undergrad can prove it correct... you too could have an algorithm named after you if you had the foresight to be a computer science professor in the 1960s!
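for reference, the "obvious greedy approach" mentioned above looks roughly like this (a hedged sketch; the graph format and names are my own choices, not a canonical implementation):

```python
# dijkstra's shortest-path algorithm: repeatedly settle the closest
# unsettled node, using a min-heap of (distance, node) entries.
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]} with non-negative weights.
    Returns shortest distances from source to every reachable node."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue  # stale heap entry, already found a shorter path
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(dijkstra(g, 'a'))  # {'a': 0, 'b': 1, 'c': 3}
```

greedy + a priority queue gives polynomial time, which is exactly the kind of structure nobody has found for NP-complete problems.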

but there were still some problems where no known technique worked better in general than the brute-force strategy of enumerating all possible answers and checking each one. at first these problems were discovered independently, but researchers came to realize that they were all hard in the same fundamental way, and that any general polynomial time solution to one of them would work for all of them, so all of NP would be in P. that's what the NP-complete stuff is about
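the brute-force strategy described above, sketched for SAT, the original NP-complete problem (my own toy encoding: a formula is a list of clauses, each clause a list of literals, where literal k means variable |k| and a negative sign means negation):

```python
# brute-force SAT: try all 2^n truth assignments and check each one.
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Returns a satisfying assignment as a tuple of bools, or None."""
    for bits in product([False, True], repeat=n_vars):
        def assign(lit):
            value = bits[abs(lit) - 1]
            return value if lit > 0 else not value
        if all(any(assign(lit) for lit in clause) for clause in clauses):
            return bits
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(brute_force_sat([[1, 2], [-1, 3], [-2, -3]], 3))  # (False, True, False)
```

checking each assignment is fast; the 2^n loop is the part nobody knows how to avoid in general.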

so there's a bunch of problems that a bunch of different smart people all thought they could solve efficiently, but none of them could do it, so it seems obvious that P != NP right? except nobody has been able to prove it, and they don't even really know how to go about attempting to prove it. math is hard

Thank you user, is there more stuff about this i can read easy to understand without math background?

Depends on how it's proven. At this point, it seems exceptionally unlikely that there's any practically fast solution to NP-COMPLETE problems. There might be a polynomial-time solution, but given what we already know, it's going to be so slow that it might as well not exist compared to existing known exponential-time solutions. So, in terms of practical computer science, it wouldn't matter that much.
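the "polynomial but practically nonexistent" point can be backed with actual numbers (degree 100 is my own illustrative choice):

```python
# a degree-100 polynomial beats 2^n asymptotically, but only at input
# sizes far beyond anything practical.
def poly(n): return n ** 100
def expo(n): return 2 ** n

print(poly(50) > expo(50))      # True: at n=50 the "fast" algorithm is far slower
print(poly(2000) < expo(2000))  # True: it only wins out around n ~ 1000
```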

It would be pretty cool as an abstract mathematical proof.

to know more about the current state of the problem without a phd in cs, this is good: scottaaronson.com/papers/pnp.pdf

math is unavoidable. you should be able to grasp the paper if you have an undergraduate understanding of math

Thank you,
Seeing this kind of thing makes me wish I got a degree in CS

Correction: Then again, maybe it'll allow the creation of something like rainbow tables to break encryption. I should be more careful about the applications w.r.t. cryptography.

Keep in mind that this only applies to conventional computing. If somebody was able to create a temporal computer, it could quickly solve these problems even if P is not equal to NP.

You mean quantum computer, right? And you would be correct, to a degree. Depends on the exact nature of the quantum computer.

nobody thinks about non-deterministic computers except some theorists a few decades ago (note that the quantum computers you hear about in the news sometimes would not be able to solve NP-complete problems in polynomial time).

No, a temporal computer is a hypothetical machine that uses the self-consistency principle to manipulate the outcome of a "guess". It arranges itself so that a wrong guess causes a paradox, which means the answer can only be right, except in the case where the machine has a higher probability of spontaneously breaking than it does of giving the right answer. In that case, the machine just breaks.

Temporal computers can also do things bordering on the realm of magic. You can have a screen that generates random colored pixels with a button press, and then have another button that triggers the paradox mechanism of a temporal computer. Generate an image on the screen, then trigger the paradox mechanism if it doesn't show you what you want. Same thing applies to generating text. Keep in mind that trying to do this may break the machine or cause the universe to kill you with a random brain aneurysm before you trigger the paradox mechanism, in the case where you ask it for something that is too complicated.

Complexity theory is incredibly hard. Basically, the question of whether P=NP is a question about whether certain problems are fundamentally "harder" than other problems (i.e. the "harder" problems require more work to solve).

There's actually a lot of similar open problems in complexity theory. Just like nobody knows if P=NP, there are many other pairs of complexity classes which nobody knows whether they're equal or not.

Basically human beings are just terrible at complexity theory so far.

>Basically human beings are just terrible at complexity theory so far.

Terrible compared with what, exactly? I'd say we're pretty damn good at it, it's just too hard even for us.

N=1
or
P=0

we fucking suck at it. it's like an entire field of huge open questions

hi computer science here

p vs np is retarded and has no computational application even if solved. no reasonable computer system will be able to solve NP problems in practical time anyway.

I don't understand how a proof of this sort of thing could even be presented. It just seems like an incredibly general question.

you're p fuckin dumb. np is a class of problems. perhaps some np problems have p solutions we are at present unaware of. in general, if solved, you have an untold advantage over any other business or governmental agency. it's a huge deal. what are you, a first year cs or some shit?

Protip: He's probably right. If there was a practically fast solution, we would have found it already. No one cares if there's a degree-1000 polynomial solution for practical applications (probably).

no one was able to solve relativity before einstein, so what if we just need the computing version of einstein to come along and revolutionize everything? it seems like we're gonna need him anyway even to prove that P != NP, because no one has the slightest idea of how to do that, which doesn't exactly inspire confidence in our current understanding of the hardness of these problems.

I never took any formal classes on CS or set theory so this question is probably going to sound really stupid but here goes:

Why isn't the possibility for P=NP to be independent of the Peano axioms taken more seriously?

In Section 3.1 Aaronson briefly discusses the possibility and dismisses it because
>At the end of the day, a polynomial-time algorithm for 3Sat either exists or it doesn’t!
But I'm not convinced: you could just as well say
>At the end of the day, a well-ordering of the real numbers either exists or it doesn’t!
and that doesn't stop the axiom of choice from being independent of ZF set theory (any such well-ordering would not be definable using a finite number of mathematical symbols).

Background: AC was proved independent by constructing two models of ZF set theory, one where it is true and one where it is false. A "model of ZF set theory" is simply a directed graph [math](V,\in)[/math] such that, if you pretend the vertices are sets and the edges indicate set membership, it obeys all the axioms of ZF set theory. (E.g., there is exactly one set with no elements --> there is exactly one node with indegree 0.) This still gives you enough freedom to set up the rest of the graph such that the AC property can either hold or not.

The computational-complexity analogue of this would be the Baker-Gill-Solovay theorem, which constructs oracles A,B such that P=NP given oracle A, but [math]P\neq NP[/math] given oracle B. Since P=NP is equivalent to the claim that first-order logic plus a least-fixed-point operator expresses exactly the same properties as existential second-order logic (on ordered structures; see e.g., en.wikipedia.org/wiki/Descriptive_complexity_theory), it seems that you can interpret each of these logics as a model of Peano arithmetic and obtain something closely resembling an independence result.

Of course this "close resemblance" is not a formal proof, but I'm still surprised that hardly anyone seems to be pursuing this angle.

I did not find this on wikipedia. Does it have another name?

en.wikipedia.org/wiki/Novikov_self-consistency_principle#Time-loop_logic

An important note about independence from Peano arithmetic is that such independence theorems generally require uncountable sets. This problem would very likely only fall under that issue if the solution somehow required an uncountable polynomial.

>Unless there possibly only existed an uncountable polynomial for the solution
Defining what it means for a polynomial to be uncountable is a bit tricky since countability is not an absolute property (e.g. Skolem's paradox), but I suppose it could be done.

But why would you need the polynomial itself to be uncountable? Even a polynomial with countably infinitely many coefficients in {0,1} would be enough to encode an arbitrary oracle, i.e. a subset of {0,1}*.
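the encoding in that last point can be sketched concretely (the length-lexicographic index scheme and names below are my own choices): list all binary strings in order, and let coefficient i of a formal power series be 1 iff the i-th string is in the oracle set.

```python
# encode an oracle (a subset of {0,1}*) as a countable 0/1 coefficient sequence.
def string_to_index(s):
    """Bijection {0,1}* -> N: '', '0', '1', '00', '01', ... -> 0, 1, 2, 3, 4, ..."""
    # strings shorter than len(s) number 2^0 + ... + 2^(len(s)-1) = 2^len(s) - 1
    return (2 ** len(s) - 1) + (int(s, 2) if s else 0)

oracle = {'1', '01'}  # a toy oracle; any subset of {0,1}* works the same way
def coeff(i):
    return 1 if any(string_to_index(s) == i for s in oracle) else 0

print([coeff(i) for i in range(7)])  # [0, 0, 1, 0, 1, 0, 0]: index 2 is '1', index 4 is '01'
```

countably many 0/1 coefficients suffice because {0,1}* itself is countable, which is the poster's point.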