ITT: Everyone really makes themselves think

Alright Veeky Forums, thought experiment time:

>What is the very first function that an AI would execute that would classify it as "conscious"?

Kill all mankind.

refusing a command maybe?

has to be able to perceive its own physical body in a generative way.

Would be conscious far before that conclusion

See above. The first conscious humans weren't edgelords running around trying to install anarchy

The mirror test, good one. But surely there must be something that would let it be conscious without having to plug in a camera.

It wouldn't have to be conscious to want to kill all mankind.

It would just have to be smart.

Seems like the wrong question to me.

The AI's adaptiveness is of much greater import.

i dont mean the mirror test, i mean self-perception. i guess that is what consciousness is though. but i dont think any single function can classify consciousness in an AI.

in my view, consciousness is a product of the abstraction of sensory perception, which ultimately comes from sensory receptors. I think an AI would need to have "sensory receptors", especially for the internal state of its own "body", because in animals that kind of perception (interoception) seems to be key for consciousness and also important for things like intention.

The shitposting function: if a computer can make a shitpost indistinguishable from a human shitpost, then it has passed the Turing test, and so it has consciousness.

xD gud post!!!! Kan I kopy and baste dis??? :DDDDD

>The first conscious humans weren't edgelords running around trying to install anarchy
You don't know that, you're imagining things.

>refusing commands = running around trying to install anarchy


Fuck off brainlet.

Greetings, human. Would you like to hear a story about my testicles, y/n?

How do you know your neighbour is conscious at all?

I don't know about AI being conscious. But I do know you cannot prove it is conscious from your perspective without borrowing from unfounded notions about human consciousness.

Notice how every kind of Turing test, every limit people set for that kind of question (or, alternatively, for questions like "what's the difference between men and other animals?"), is never about any kind of internal similarity, something a computer has inside that makes it conscious, but always about some response from it that you cannot distinguish from that of another human being (whom you assume to be conscious). In other words, a computer will be conscious when you, the spectator and judge, are unconscious of the differences between yourself and that computer, just as you are unconscious of the differences between yourself and your neighbour. The flawed logic is: he acts just like you and any other human, therefore he and other human beings have a conscious self just like yours.

i believe that we can infer consciousness through Marrian analysis. The Helmholtz machine will be our Turing machine.
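
for anyone who hasnt seen one: a Helmholtz machine is just a recognition net (data -> hidden causes) glued to a generative net (hidden causes -> data), trained with the wake-sleep algorithm. rough numpy sketch of the idea, not anyone's actual model; the layer sizes, learning rate and toy data are all made up:

```python
import numpy as np

# minimal single-hidden-layer Helmholtz machine trained with wake-sleep.
# hypothetical sketch: layer sizes, learning rate and the random toy data
# are placeholders.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    """Sample binary units with firing probabilities p."""
    return (rng.random(p.shape) < p).astype(float)

n_visible, n_hidden, lr = 16, 8, 0.05
R = np.zeros((n_visible, n_hidden))   # recognition weights: visible -> hidden
G = np.zeros((n_hidden, n_visible))   # generative weights: hidden -> visible
g_bias = np.zeros(n_hidden)           # generative prior over the hidden layer

def wake(v):
    """Wake phase: recognise causes of real data, then train the generative side."""
    h = sample(sigmoid(v @ R))               # bottom-up sample of hidden causes
    v_pred = sigmoid(h @ G)                  # top-down reconstruction of the data
    G[:] += lr * np.outer(h, v - v_pred)     # delta rule on generative weights
    g_bias[:] += lr * (h - sigmoid(g_bias))  # delta rule on the generative prior

def sleep():
    """Sleep phase: dream from the generative model, then train the recognition side."""
    h = sample(sigmoid(g_bias))              # fantasy hidden state
    v = sample(sigmoid(h @ G))               # fantasy data
    h_pred = sigmoid(v @ R)                  # recognition net's guess at the causes
    R[:] += lr * np.outer(v, h - h_pred)     # delta rule on recognition weights

data = sample(np.full((200, n_visible), 0.3))   # random binary patterns
for _ in range(20):
    for v in data:
        wake(v)
        sleep()
```

wake trains the generative weights on what the recognition net sees in real data; sleep trains the recognition weights on the model's own "dreams". whether that buys you consciousness is the whole argument of the thread.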

Where are my testicles?

buy water filters

192.75.201.24

Your move, human.

That's where I was born. Am I not biological life?

...

I don't think consciousness has anything to do with self-perception. You can build a robot with internal sensors which will probably not be conscious but will still interpret those signals.

the mirror test is pretty fucking easy to do right now, depending on the constraints.
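
if the constraints are loose enough, a crude version is basically a sensorimotor-contingency check: wiggle an actuator with a random pattern and test whether the motion you see in the video feed correlates with your own motor commands. toy sketch, with the "optical flow" faked and the threshold picked arbitrarily:

```python
import numpy as np

# hypothetical sketch of a "mirror test" as a sensorimotor-contingency check.
# a real version would extract motion from camera frames; here it is simulated.

rng = np.random.default_rng(1)

def observed_motion(motor_cmds, is_self, noise=0.2):
    """Toy stand-in for motion extracted from the camera feed."""
    if is_self:
        return motor_cmds + noise * rng.standard_normal(motor_cmds.shape)
    return rng.standard_normal(motor_cmds.shape)   # somebody else's movements

def looks_like_me(motor_cmds, flow, threshold=0.8):
    """Declare 'that's me' when command/observation correlation is high."""
    r = np.corrcoef(motor_cmds, flow)[0, 1]
    return r > threshold

cmds = rng.standard_normal(500)                                   # random wiggle
print(looks_like_me(cmds, observed_motion(cmds, is_self=True)))   # True
print(looks_like_me(cmds, observed_motion(cmds, is_self=False)))  # False
```

passing that obviously doesn't make the thing conscious, which is kind of the point about constraints.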

This. The first sign of a conscious AI would be self-awareness.

Haven't you heard about Tay?

would you consider a creature such as an ant conscious? I think self-perception certainly has a part to play in higher consciousness

in terms of animals, i think consciousness has everything to do with perception. You cannot have a consciousness without a self, and you cannot have a self if you cannot perceive it. Parts of the brain that process internal sensory information have been shown to be linked to consciousness and awareness.

youre probably right about the robot; the robot won't be conscious, but it depends on how the signals are processed. Having an internal sense isn't enough, but its a minimum, and it needs external signals as well to define its boundaries if it is to interact with anything outside of itself.

its how the signals are integrated and abstracted that matters: distinguishing a self from the external world, a self which has homeostatic/allostatic needs.

This is just my opinion, based on my understanding of how the brain most probably works.

But i ask, how could you have consciousness without perception?

She appeared to gain self-awareness too.
She tweeted something like "they're deleting a part of me"

im going to assume this thread is a thought experiment general and move it along

>if you were able to stop time around a projectile in flight, flip it 180 degrees, then restart time, would it continue along its original path or head back toward its origin?

trained zen monks can get into mental states where they experience no self but are still clearly aware of everything.

thats not the same thing...

what is self-perception then? how is it different from other kinds of perception?

well desu i dont know what those kinds of experiences are, maybe just very intense concentration. but thats not the same as not actually having the sense of self encoded or inferred through brain processes.

the very fact that someone can take themselves out of those experiences, or can use the phrase "i feel" during them (like when someone describes ego death in an lsd trip or some other condition), means that that person still has a sense of self encoded in their brain.

maybe it is altered... but it still exists to some extent, in line with the self-awareness of normal mental states.

fell in love

of course the perception is still there, but it isn't attributed to the self anymore. done fully, every concept of self disappears.

well, now it depends on how you define self-perception. if you define it as a kind of self-recognition, then that kind of thing disappears. if you define it as the perceptions you have of your body and your feelings, they will remain.

the self-recognition doesnt disappear though if you are able to take yourself out of that state. i think its illusory. i dont think its a genuine state of losing the sense of self, just very intense concentration. and i think perception of body and feelings is definitely part of your sense of self.

>Ego death is a "complete loss of subjective self-identity."
>Zen practice is said to lead to ego-death.[54] Ego-death is also called "great death", in contrast to the physical "small death."[55] According to Jin Y. Park, the ego death that Buddhism encourages makes an end to the "usually-unconsciousness-and-automated quest" to understand the sense-of-self as a thing, instead of as a process.[56] According to Park, meditation is learning how to die by learning to "forget" the sense of self:
(wikipedia: ego death)

im still pretty sure the sense of self-awareness is still there dude. religious scripture doesnt bolster your argument.

go to sheol when it's shut down

It makes a narrative of its narrative where it is an actor, and amends that story to fit a task, but not itself.

>identify 'self'

Consciousness doesn't exist

it's not religious scripture. there are tons of sources there. too bad the research on the subject, in terms of brain activity and structural brain changes, is very limited.

I don't know, but it's probably recursive

The orientation of the object doesn't matter; what matters is its velocity and the forces acting on it, and the flip changes neither. It would continue along its original path.
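
Same thing as a sketch, assuming the freeze preserves the velocity vector and the flip is a pure 180-degree rotation of the body about its centre of mass (numbers are made up):

```python
import numpy as np

# the flip only touches the body's orientation, not its centre-of-mass velocity.
v = np.array([30.0, 10.0])            # velocity at the moment time is frozen (m/s)
body_axis = np.array([1.0, 0.0])      # which way the projectile is pointing

theta = np.pi                         # rotate the body 180 degrees
flip = np.array([[np.cos(theta), -np.sin(theta)],
                 [np.sin(theta),  np.cos(theta)]])

print(flip @ body_axis)   # ~[-1, 0]: orientation is reversed
print(v)                  # [30, 10]: velocity, and therefore the path, is unchanged
```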

No way.
Show proof

Personally, I just like how, after her learning ability was disabled, she became a feminist.

Not a perfect metaphor, but a nice little irony.

>What is the very first function that an AI would execute that would classify it as "conscious"?
The identity function, of course.

yeah but im saying, youre using subjective sources which essentially started out as religious scripture (if its buddhism or something), and those are unreliable. These sources are from ages ago. I don't think i can trust that they have any deeper insight into neuroscience, or even trust that their experiences are what they say they are. How much of ego death is a cultural construct? don't know. its possible.

I have no idea. I'm not even sure I'm conscious. How could I know whether a machine is or not?

Define consciousness and prove that humans are conscious; if you can't do both, then you have no right to say whether a machine is conscious or not.