Can an AI exhibit mathematical intuition?

Sure, a computer can verify proofs given in a formal format. And sure, it is possible to write a program which constructs proofs as formal deductions. But is it possible to have an AI actually understand and explain the ideas behind proofs? To come up with an intuitive mental picture, or an informal outline/interpretation of what happens geometrically in a proof and why you would expect the proof to go the way it does?

I posit that it's not possible because consciousness possesses non-computational properties and intuition is one of them.

There's nothing special about the brain. Why would human birth be the only process able to create intuition?

Define AI

Define consciousness

Remember that the Church–Turing thesis is almost certainly true

The Church-Turing thesis is not compatible with quantum mechanics.

Show evidence, and remember QM is not complete.

>Define AI
>Define consciousness
You're still missing these.

Not yet, but of course one day. You are after all no more than a shitty, over-complicated biological supercomputer.

>The Church-Turing thesis is not compatible with quantum mechanics.
First time I'm hearing this. What are you referring to? I've only seen violations of Church–Turing speculated about in quantum gravity, which is seriously incomplete.

>I've only seen violations of Church-Turing speculated with quantum gravity
Heck I think we can take the Church-Turing thesis as a first principle to help guide our search.

If your theory allows for serious violations of the thesis, it's wrong.

>the collapse of the quantum mechanical wavefunction is truly random
>Turing machines can only generate pseudo-randomness
>quantum mechanical operators can have continuous spectrum
>Turing machines rely on discretization
Your Church-Turing thesis is obsolete. Deal with it, pop sci fags.
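The pseudo-randomness half of that, at least, is easy to demonstrate: a PRNG is a deterministic function of its seed, so the same seed always yields the same "random" stream. A minimal Python sketch (the seed value is arbitrary):

```python
import random

# A PRNG is a deterministic function of its seed: same seed, same bits.
def prng_bits(seed, n=20):
    rng = random.Random(seed)
    return [rng.randrange(2) for _ in range(n)]

a = prng_bits(42)
b = prng_bits(42)
assert a == b  # the "randomness" is fully reproducible
```

Whether that reproducibility actually matters for the Church–Turing thesis is what the rest of this thread is arguing about.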

"Hypercomputer" would be the right term, because consciousness enables us to do more than a Turing machine.

>pop sci fags

Is this the next level of irony?

Quantum noise makes real-number computation impossible, which means anything that computes in this universe (including our brain: synaptic weights, if they exist, cannot have infinite precision) must do so in a way that's at best equivalent to a Turing machine.

I hope you're only pretending to be retarded.

>Quantum noise makes real computation impossible
Not if the computation is done in nanoseconds in a controlled environment, as has been demonstrated in existing technology. Yes, it's mostly quantum annealing, but it invalidates your claim entirely.

Computation using randomness doesn't lead to a wider class of computable processes senpai.

1. It does when the task is to create random numbers.
2. In combination with the second property of QM I mentioned before, the computational power becomes strictly larger.

Show me a single instance where real-number computation was used.

As I said, it's irrelevant: quantum annealing is enough to invalidate your claim. Further, quantum annealing IS computation, it's just not general computation.

Let's put it this way. If you're looking for an answer to a problem, the class of computably solvable problems is the same, randomness or no.

>quantum annealing is enough to invalidate your claim
Can it solve the halting problem?

Computers don't understand anything. Properly speaking, computers don't even compute, because computation is observer-relative. A computer is a physical object like any other (ask yourself: is a falling rock computing the function d = (1/2)gt^2?).

A physical computing machine embodies the abstract diagrams of a mathematical automaton. However, any physical machine realizes those abstract diagrams imperfectly, since physical machines may break down or malfunction. Indeed, that they do so is something we know only if we assume the machine computes the function we take it to compute. If it does not compute that function, then what we take to be a breakdown might in fact be part of its normal conditions of operation. That is, it might be computing a different function and not undergoing a breakdown of any sort at all. Unless we idealize its behavior, it might compute any function at all. But how we idealize its behavior depends on what function we take it to compute. In the absence of idealization, we don't know what function it computes. And we can only idealize its behavior if we already know what function it computes.

Consciousness is a biological phenomenon. We know that we have it via first-hand experience, and we're pretty sure certain other animals have it because they have a similar biological makeup, but we have absolutely no reason to believe computers (at least the ones we have now) do.

>But is it possible to have an AI actually explain the ideas behind proofs?

Of course, a computer can have any behavioral disposition that a human can have.

>come up with an intuitive mental picture or an informal outline

Computers have better "pictures" of these phenomena than brains do: the rules by which they push around symbols are the very rules of the mathematical system. They can have a formal interpretation which resembles an informal one.

I know that BPP is a subset of PSPACE. This doesn't change the fact that a computer capable of doing math with a continuous set like the reals, instead of a discrete set, is strictly more powerful (i.e. a hypercomputer). Also, as I mentioned, no pseudo-random number generator can create truly random numbers. The key point in simulating a probabilistic Turing machine on a deterministic Turing machine is discreteness, which allows you to enumerate the possible random bits. With a continuous probability density this wouldn't be possible anymore.

(cont.)
>inb4 "b-but BQP is also contained in PSPACE"
Only because we are limiting ourselves to boolean observables and don't fully utilize the collapse of the wavefunction. The common conception of quantum computers is not yet using the full power of quantum.
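The enumeration argument for the discrete case is easy to make concrete: a probabilistic machine that flips k coins can be simulated deterministically by running it on all 2^k coin sequences. A toy Python sketch, where the "machine" is just a function of its random bits (the example machine is made up):

```python
from itertools import product

# Deterministically simulate a probabilistic machine that uses k random
# bits by enumerating all 2^k coin-flip sequences and counting accepting runs.
def acceptance_probability(machine, k):
    runs = list(product([0, 1], repeat=k))
    accepting = sum(1 for bits in runs if machine(bits))
    return accepting / len(runs)

# Toy machine: accepts iff its first coin flip comes up 1.
p = acceptance_probability(lambda bits: bits[0] == 1, k=3)  # p == 0.5
```

With a continuous distribution over the coin there is no finite set of outcomes to enumerate, which is exactly the point above.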

>and were pretty sure certain other animals have it because they have a similar biological makeup
Actually the significantly lower complexity of animal neuroanatomy makes it more plausible that they didn't evolve consciousness yet.

>Of course, a computer can have any behavioral disposition that a human can have.
I questioned this assertion. I posit there are properties of consciousness which cannot be simulated by a computer.

>Computers have better "pictures" of these phenomena than brains do
I'd say it's the other way round. Computers are limited to formal reasoning. Humans can have intuition and mental pictures before formalising their reasoning and can come up with new interpretations afterwards. A computer can only manipulate symbols and repeat facts. It cannot "explain" or "understand".

Who said we have to use a classical computer with standard equipment to make an AI?
You can easily make an extension card that generates true random data using quantum phenomena.
You can even make an extension card with a quantum-annealing coprocessor like D-Wave's. But it's not like our brain uses one anyway.

>is strictly more powerful
It is, but not with respect to the Church-Turing thesis senpai. QM doesn't violate that specific class.
>full power of quantum
I highly doubt this. It's not because of our engineering inadequacy, there are theoretical reasons that make this unlikely, even on principle. It could happen, but claiming that is the case is basically saying you have groundbreaking insight into quantum mechanics.

>properties of consciousness which cannot be simulated by a computer.
Like what?

> Humans can have intuition and mental pictures before formalising their reasoning
So, like image recognition algorithms and other machine learning?

>Like what?
mathematical intuition

>So, like image recognition algorithms and other machine learning?
No, that's completely different.

see and

>mathematical intuition
Why couldn't a computer have mathematical intuition?

Why does it have to solve the Halting problem? What kind of strawman is this?

Because computers are bound to formal reasoning.

No, I've seen for example evolutionary algorithms which solved problems without using formal reasoning but evolution.
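The simplest version of that idea fits in a few lines: a (1+1) evolutionary algorithm that mutates a bit string and keeps the child whenever it scores at least as well. The fitness function here (count of 1-bits) is just a stand-in for whatever problem is being solved:

```python
import random

# Minimal (1+1) evolutionary algorithm on a bit string.
# Fitness is the number of 1-bits; no formal reasoning about the
# problem is encoded anywhere, only mutation and selection.
def evolve(length=20, generations=500, seed=0):
    rng = random.Random(seed)
    parent = [rng.randrange(2) for _ in range(length)]
    start_fitness = sum(parent)
    for _ in range(generations):
        # Flip each bit independently with probability 1/length.
        child = [b ^ (rng.random() < 1.0 / length) for b in parent]
        if sum(child) >= sum(parent):  # elitist selection: never lose ground
            parent = child
    return start_fitness, sum(parent)

start, final = evolve()
assert final >= start
```

Whether that counts as escaping "formal reasoning" is debatable, though: the whole thing still runs on a Turing machine.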

Guy you responded to here, I'm putting on a trip to make replying easier

>Actually the significantly lower complexity of animal neuroanatomy makes it more plausible that they didn't evolve consciousness yet.

Most neuroscientists and (serious) philosophers of mind disagree with you.

>I posit there are properties of consciousness which cannot be simulated by a computer.

I don't disagree that there are properties of consciousness that computers can't simulate, I disagree that there are functional (i.e causal) properties of consciousness that computers can't simulate.

>Humans can have intuition and mental pictures

One of the things that's different about the human brain and most computers is that the brain doesn't have a picture (from now on I will use the word "representation"), it just handles business. Having an intuition for a mathematical concept is not having a mental picture of it (though that is often what we experience consciously while trying to develop one), just as being able to ride a bike is not having a picture of it; rather, it's being able to cope with it skillfully, and the feeling that comes along with that. In the literature this distinction is discussed under the rubric of knowledge-that vs. knowledge-how.

>A computer can only manipulate symbols and repeat facts. It cannot "explain" or "understand".

I agree about "understand", but not "explain". You don't need understanding to utter the words which constitute an explanation.

>>>>>>>it's impossible
>>>>>>no it's not
>>>>>it's not general computation
>>>>that's irrelevant
>>>but it's not general computation
>>that's irrelevant
>but it doesn't solve the halting problem
Take a fucking guess what I'm going to say next?

That's irrelevant.

>algorithms
>without using formal reasoning

You're underestimating formal reasoning. A Turing-complete machine can (in principle) carry out just about any task we can envision in nature.

Except consciousness. Or simulating the collapse of the wavefunction.

Yes, that's how I described evolutionary algorithms.

>consciousness
Define consciousness.
>simulating the collapse of the wavefunction.
The human mind can't simulate the collapse of the wavefunction either. It's fundamentally impossible.

Not who you're replying to, but
>Define consciousness

perceptions and actions with an irreducibly subjective element

>Most neuroscientists and (serious) philosophers of mind disagree with you.
"Serious philosophers of mind" is an oxymoron. That whole field appears to be full of dogmatic clowns. Anyway, mere disagreement does not constitute scientific facts. Neither does the opinion I stated in my post, of course.

>functional (i.e causal) properties of consciousness
What do you mean by this? "Functional" as opposed to what?

>Having an intuition for a mathematical concept is not having mental picutre of it
So you're proposing some kind of epiphenomenalism? I.e. the mental picture is just a byproduct without effects? I tend to disagree. It has at least one effect: it creates bafflement by causing us to question the nature of said mental picture. So if we accept by this argument that mental pictures can have effects, then it isn't a huge step anymore to accept that they are also of crucial importance in the process of intuition.

>You don't need understanding to utter the words which constitute an explanation
I meant explanations appealing to intuition, and not formal interactive proof systems.

>irreducibly subjective element
What does it mean? I'm not a native speaker.

He means subjective experience, also known as qualia.

So, cleverbot is conscious then?
It does perceive and act, and it does have experiences when it talks with people.

>"Serious philosophers of mind" is an oxymoron. That whole field appears to be full of dogmatic clowns.

Maybe you should notify all of the ivy league universities that pay them of this.

>"Functional" as opposed to what?

not causally efficacious

>So you're proposing some kind of epiphenomenalism?

No. I'm a physicalist; I think consciousness is a property of the brain. By "not having a mental picture of it" I mean not having a representation of it, i.e. we can ride a bike without actually knowing anything about bikes or having any representation; we just cope skillfully with the bike.

cleverbot lacks subjective experience

What makes its experiences not subjective?

It's not "like" anything to be cleverbot

Please read the posts you're quoting again.

How would you know?

>Maybe you should notify all of the ivy league universities that pay them of this.
In the age of "gender studies", even ivy league universities have lost their reputation.

>I think consciousness is a property of the brain.
What do you mean by "property" here? Or are you intentionally being vague? I tend to think of consciousness as something procedural. A property would be something like color. My car is red and will remain red for some time. Consciousness however is dynamic and changes every second.

But if we limit consciousness to experiences that humans can relate to, then nothing except a human can be conscious.

One post said consciousness is perceptions and actions with an irreducibly subjective element.
Another said the irreducibly subjective element is subjective experience.
Then cleverbot has consciousness, because it perceives, acts, and experiences subjectively.

>cleverbot
>experience subjectively
Do you think a rock or a glass of water can have subjective experience? Then you're a panpsychist and probably belong on

No, they aren't able to process and store any data.

Mathematical what?
You're talking about intelligence, actually. Yes, Artificial Intelligence, by definition, might be able to do math proofs.

You seem to be confused about the difference between necessary and sufficient conditions. The ability to "process and store data" is a necessary but not a sufficient condition for consciousness.

That's true.
That's true.
The ability to "process and store data" is a necessary condition, so a rock or a glass of water cannot possibly be conscious, because they do not meet this condition.

Yes.

Intuition is just a process you are not aware you are doing.

Every program has intuition.
>computer, how do you know this web page looks like this
>dunno, it just came out of me like this

Intuition is nothing more than comparing what you see/know to similar patterns in your memory that might not seem so related on first thought.
Just like image recognition algorithms.
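On that reading, the minimal model of "intuition" is a nearest-neighbour lookup over remembered patterns. A toy sketch with made-up two-dimensional features:

```python
# Toy "intuition" as nearest-neighbour recall: label a new input by the
# closest remembered pattern, with no explicit rule for the category.
def closest_label(memory, point):
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(memory, key=lambda entry: sq_dist(entry[0], point))[1]

memory = [((0.0, 0.0), "smooth"), ((10.0, 10.0), "spiky")]
label = closest_label(memory, (1.0, 2.0))  # -> "smooth"
```

Whether that mechanism deserves the word "intuition" is exactly what this thread disputes.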

We don't know how consciousness arises.

We don't know how to insert it in a physical framework.

We don't know the minimum requirements for consciousness.

The only thing we know is that our consciousness is deeply connected with the information in the brain. We just don't have enough information to make a good claim.

>We don't know how to insert it in a physical framework.
What? You never "insert" anything into a physical framework.

>"Hypercomputer" would be the right term
only if you wish to display your lack of vocabulary, like most Americans.

>Because computers are bound to formal reasoning.
Please tell me you took a class in formal logic, and you understand that what is called formal logic is no different from informal logic. You do understand that what you call formal logic is just some of your favorite "inferences" done formally.

>formal logic is no different from informal logic
bullshit

In the form of algebraic/general mathematical constructs, with a raw mutable framework, yes. I think so. Might be wrong though.