Can consciousness exist digitally?

If our brain is just a whole bunch of stimuli and maths, then yes.

youtu.be/-nbTrPwQudo

Thank me later

Lain is awesome.

on topic, we don't know enough about consciousness to answer that question. so it's a philosophical/theological question.

Potentially. That doesn't mean you can transfer an existing consciousness to a machine, though; you could create a copy.

Maybe, but I suspect our massive amount of misfiring neurons might be an essential feature, to say nothing of brain chemistry, so the programming would need some kind of trick to it. At the very least it'll waste a lot of available processor power.

Both are just electricity and nothing more, so yes.

Descartes said "I think therefore I am" and HE was just a character in a shitty universe simulator so...

>consciousness
Define "consciousness" in empirically verifiable terms...

...I'll wait.

Purely hypothetically speaking, yes. How would we know that it's a consciousness? That's a very good question, let me know if you figure that one out.

How do you know someone you meet has a consciousness? When you think about it, you don't. But you know you have one, or rather you think you have one, so it only seems fair to extend that assumption to the people you meet.
> At the very least it'll waste a lot of available processor power.
This is very true. We've been able to make a digital worm brain, but the question is whether we could ever hope to reach the complexity of a human brain.

>Consciousness can exist digitally
>You can't transfer current consciousness to a machine.
>You can create a copy though.

Explain this. You can copy and paste it, but you can't transfer or cut then paste?

All I hope is that before they create a functional AI, they'll have a virtual tactile environment and body for it to interact with. Imagine being a brain uploaded into absolute nothingness.

Cut and paste is still just a copy, the existence is not continuous.

>could we ever hope to reach the complexity of a human brain?
Seems inevitable, really... Computational power and scan resolutions are going nowhere but up. Models of the brain are only becoming more in-depth and closer to complete.

Even if we never figure out how to code up AI from scratch, we'll eventually, and more or less inevitably, have a simulated human brain with sufficient simulated input not to go catatonic off the bat. ...at which point I'm sure we'll look back at how valuable all these threads were to the ethical dilemmas that result (or rather, not).

Even with that though, we still wouldn't be able to answer the question, save maybe legally. Because, while all that tech is improving, we're really no closer to creating an empirical definition for consciousness than we were back when Socrates and Zeno were debating the subject.

>Explain this. You can copy and past it, but you can't transfer or cut then paste?
youtube.com/watch?v=nQHBAdShgYI
This should disturb you.

Cut and paste on the same hard drive should just change the reference; cut and paste to a different device means the original is destroyed and a duplicate is created.
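
Rough sketch of that difference in file-system terms (Python; this assumes a POSIX-ish filesystem, and the file names are made up for the demo):

import os, tempfile

d = tempfile.mkdtemp()
src = os.path.join(d, "memories.doc")   # hypothetical file, just for the demo
with open(src, "w") as f:
    f.write("same data either way")

before = os.stat(src).st_ino            # which on-disk file this name points at

# "Cut and paste" on the same drive: just a rename, the data itself never moves.
dst = os.path.join(d, "moved.doc")
os.rename(src, dst)
print(os.stat(dst).st_ino == before)    # True: same file, new name

# Across devices os.rename fails (EXDEV), so tools like shutil.move fall back to
# copying the bytes to the destination and deleting the original - a new file.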

Well, the advantage that the Star Trek teleporters have over most sci-fi teleporters is that they use a "matter-energy transfer beam" - so it's at least using the same energy that made up your body, creating a quantumly entangled consistency. (Not that they don't occasionally violate this in the series.)

It's, in some ways, not that different from chopping you into 5 bits, cryo-storing them, and then reassembling them elsewhere later, save that there are more bits involved.

But a teleporter that just disassembles you, sends you as data, and recreates you elsewhere, has killed you and made a copy, yes, and has no limit to the number of copies it can make, provided it has the energy to produce the matter and the data remains intact.

So what you're saying is: I died, but I didn't even know it because a copy of me took my place and acted like nothing happened.

>Computational power and scan resolutions are going nowhere but up
Will we ever be able to run an emulated brain at full speed, or are we going to end up with something like emulating Ratchet & Clank on the PC? How would a consciousness react to being run at 70% speed, with the occasional hundred-millisecond hiccup when it thinks of something complicated? Our computational abilities are improving, but nowhere near as fast as they used to. We could very well end up hitting a wall sometime in the next twenty years.
>Because, while all that tech is improving, we're really no closer to creating an empirical definition for consciousness than we were back when Socrates and Zeno were debating the subject.
Agreed.
>(Not that they don't occasionally violate this in the series.)
Probably why the guy in the video explains it as a disassemble-and-send-as-data type of affair. I've never actually watched Star Trek, but it's useful for illustrating the point so the confused user could understand.
My question is, does it really matter? If I've had breaks in consciousness already, then I'm not the "original" me, so to speak. I have no way of telling if I am the consciousness I think I am or another instance of it. As long as two copies never interact, does it matter? The second copy will happily continue being me none the wiser, even if I were dead strictly speaking.

This may well be one of those instances where ignorance is bliss.

Essentially, yes. It's not that your copy acts like nothing happened, your copy has no idea that anything has happened. Doesn't change the fact that you're dead though.

No, the real you died; you are just a poor imitation of the original

That's depressing. Fuck teleportation, if this is the end result.

>That's depressing. Fuck teleportation, if this is the end result.
Do you think your copy would have cared if we hadn't told you? Do you think it would have noticed? The concept of consciousness is so weird that it's really hard to truly understand the consequences of it.

...

>We could very well end up hitting a wall sometime in the next twenty years.
Actually, we already have the raw computational power of a human brain readily available, even on a relatively small scale network. Even if we didn't - if the simulated input is also running at 70%, I doubt it'd know the difference. Granted, there's no advantage to it at that point, save maybe as a medical diagnostic device, or resolving certain legal motive issues.

>My question is, does it really matter?
Well, there's a line of cause and effect connecting you, if the matter-transfer beam is involved. Most people are less disturbed by the idea of being moved from one place to another (even in bits), than they are at being destroyed and reproduced elsewhere.

Granted, there's the other question - if you can create a remote copy, without destroying the original, and bring it back the same way...

>a poor imitation of the original
Actually, a perfect copy, save in quantum location - seeing as how, in the scope of the series, replicated matter has diagnosable flaws and teleported people do not. It's still not the original though, save maybe when we're being consistent in using the original energy, and it's not a teleporter-accident plot-twist episode.

>Actually, a perfect copy

I know, I was just venting at my future copy for when he remembers this

Then the copy isn't flawed, it's just you.

Garbage in, garbage out.

Nah it's not. The guy who turns up on the other end of the transport is you. For instance, since this is sci-fi anyway, imagine that I stopped time and disassembled your body atom by atom, then re-assembled it in place before restarting time.
'You' would technically have been destroyed and recreated, but when put this way it seems less bad, even though it would be functionally the same as reassembling you in another spot. This indicates to me that the fear people have of teleporters is irrational.

>Most people are less disturbed by the idea of being moved from one place to another (even in bits), than they are at being destroyed and reproduced elsewhere.
It used to disturb me until I started thinking about it. It's not too disconnected from the idea of death in general, which has never been all that unnerving. If another being continues on thinking it is me, then what difference does it make? If I am dead, I am dead and can have little say on the matter. If consciousness works in some mysterious way and I'm not dead but simply the continuation of my consciousness in copied form, then again it makes no difference.
>I know I was just venting at my future copy for when he remembers this
That's like giving a verdict of manslaughter to a parent who did everything they could to save their child but still failed. The crushing guilt it will experience for the rest of its life will be enough; do you really see a need to punish your copy further?

>For instance, since this is scifi anyway, imagine that I stopped time and disassembled your body atom by atom, then re-assembled it in place before restarting time.
But that only works if your teleporter is sending your atoms, not converting you to energy and data, sending that and then converting that back into "you".

Regardless, creating a digital copy certainly isn't the you that's sitting there reading this. It's a copy. The fleshy "you" will still be dead.

Well, if death doesn't bother you then it isn't a problem at all, obviously, but the same could be said of a whole lotta other things, and it certainly isn't the norm.

>But that only works if your teleporter is sending your atoms, not converting you to energy and data, sending that and then converting that back into "you".
Nope. Consider that I could stop time, record in detail the arrangement of your atoms, disassemble you into a random pile, then get a different bunch of atoms and, referring to my notes, reassemble you and then restore time. Your matter-pattern has been copied, you were destroyed and reassembled from different matter, and you wouldn't care at all.

He wouldn't care at all, because he'd be dead.

...His new copy wouldn't care at all, unless maybe you pointed this out to him. I suspect most of these newly created people would shrug it off as a defense mechanism though.

If it's a copy of me then he won't feel guilty, he'll be laughing it up, the cunt

You still don't get it do you?

You are not the atoms which comprise you, but rather the pattern they fall in.

>Nah it's not. The guy who turns up on the other end of the transport is you. For instance, since this is sci-fi anyway, imagine that I stopped time and disassembled your body atom by atom, then re-assembled it in place before restarting time.

In this scenario I'm perfectly fine with teleporting, as it's the original me; all that happened is I was broken down.

So, if you use those same notes to make 20 copies of me, instead of just one, they're all me?

Seems to me you'd have one guy taken out of existence for 20 copies of himself.

Plus, location is part of the quantum data that makes up the current me. So you've got to change something in the process.

That's much more of a grey area and comes back to this whole idea of a consciousness. How do you know that it's the same consciousness? Because it told you? Would a copy of the original, while leaving the original intact, not tell you the same thing?
>Well if death doesn't bother you
Should it bother me? I am alive now and quite enjoying the experience, bonus points if I can make others enjoy the experience too. If there is nothing after life then I will not know that I am dead so it will not bother me. If there is something after life then I will still be conscious and my death should hopefully not inconvenience me too much. Either way, there's not much I can do about it but try to postpone finding out for as long as possible.

>But it isn't

This. If somebody made a perfect copy down to every detail, do you now have control over both bodies?

>Plus, location is part of the quantum data that makes up the current me. So you've got to change something in the process.
Well sure, but do you feel like you've died each time you take a step and replaced yourself with a copy of you?

>So, if you use those same notes to make 20 copies of me, instead of just one, they're all me?
Yes, basically. This only strikes us as weird because of instinct, not because of any logical reason.

>Well sure, but do you feel like you've died each time you take a step and replaced yourself with a copy of you?
Yeah, but back to Zeno - there's a connection between those state changes.

Not so if you assemble me from other matter and energy elsewhere.

Works with the Star Trek transporter matter-energy transfer, as you're working with the same energy. But in this assembly model, the entanglement is broken.

No, because you are the atoms and you remain those atoms, especially when you realise the brain doesn't really regenerate, unlike the rest of your body

>but rather the pattern they fall in.
Again, that doesn't account for this vague concept of consciousness. It's so incredibly undefinable and difficult to try to pin down that we really don't know how this would work.
>Well sure, but do you feel like you've died each time you take a step and replaced yourself with a copy of you?
See the Theseus and Cutty Sark analogy. The stream of consciousness is an important concept here.

Say I froze time, disassembled your child, assembled another one of him with different atoms, and restarted time.
Then I tell you "if you kill this copy of your kid I'll reassemble the original one, with original atoms and everything" - would you do it?

I meant consciousness

Well, I'd have an emotional connection to the child even as a duplicate, so I would say no. I had no involvement in the original deconstruction, and it was instant for me, so the fact that you had murdered him would not be relevant at the time; to then create another would murder the second and create a third anyway.

Well, when having the option of not killing children, I tend to roll with that, so it's a bad conundrum.

Maybe better to ask: if your child died, and you could re-assemble a perfect copy of it as it was some minutes before, without the inevitability of it dying again, would you do it?

...and I suspect most would answer yes - but it would still, technically, not be the same child.

Even taking consciousness out of the equation, say for a soda-can, one has to ask, "Is this copy of the object the original object?", and one would be forced to answer "no", regardless of the perfection of the new copy assembled from different matter.

>so the fact that you had murdered him would not be relevant at the time, to then create another would murder the second and create a third anyway
This. You've killed one stream of consciousness and you're offering to create a third if I kill the second.
>Even taking consciousness out of the equation, say for a soda-can, one has to ask, "Is this copy of the object the original object?", and one would be forced to answer "no", regardless of the perfection of the new copy assembled from different matter.
Even if the original, crumpled soda can could be recovered and it was re-melted and put back into the original form, would that not simply be a copy using the original materials?

No matter how you word it, we can safely state that it's unreasonable to believe that perfect duplicates would share a consciousness, so how is that any different from disassembling and reassembling with the same parts?

If the original soda can was somehow perfectly recycled back into its original state, you can say that's the same soda can - it's just changed states from melted to whole again, or what not.

Which is why, with the matter-energy transfer beam model, you can say that's the same person - he's just changed states.

But if you remake the soda can, or the person, from other matter or energy in another location, then it's, by definition, a copy, and not the original, regardless of precision. You would even be able to prove this, objectively, if you had the ability to measure on the quantum level and could thus trace their origins.
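
In code terms this is basically the difference between equality and identity - purely as an analogy (Python, with a made-up can object):

# Two cans "identical in physical structure" vs one can changing state.
original = {"shape": "can", "material": "aluminium"}

# Melt the original down and re-form it: same object, it just changed states.
original["shape"] = "melted"
original["shape"] = "can"
recycled = original
print(recycled is original)   # True: one object that went through a state change

# Build a new can from other material using the same "data": equal, not identical.
duplicate = {"shape": "can", "material": "aluminium"}
print(duplicate == original)  # True: indistinguishable by content
print(duplicate is original)  # False: a different object with a separate origin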

>No matter how you word it, we can safely state that it's unreasonable to believe that perfect duplicates would share a consciousness
Right. So I feel confident in asserting that your definition of consciousness is essentially indistinguishable from that of a soul.

Well, they wouldn't share a consciousness, they would simply each have their own. (Barring supernatural stuffs.)

What you are talking about is a soul. I see it as a computer process: it's permanently destroyed upon ending, and when you start it up again it's the same application, but the original process is gone forever.

When we create AIs they're going to look at these concerns of ours and judge us as fucking pussies.

They'll judge us as whatever we allow them to

Ew... That's a potentially nasty route... The only thing that can tell the difference between one instance and another of the same MSWord with the same "memories.doc" loaded up at different times on the same system, beyond the clock time and file access dates, is outside the system.
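
To make that concrete, a minimal sketch (Python; the "memories" contents are made up for the demo) - both runs load identical state, and only the OS can tell them apart:

import json, subprocess, sys

# The "application": it loads its memories and reports everything it can see from the inside.
CHILD = """
import json, os, sys
memories = json.loads(sys.argv[1])   # same "memories.doc" contents on every run
print(json.dumps({"state": memories, "pid": os.getpid()}))
"""

memories = json.dumps({"owner": "anon", "last_thought": "teleporters"})
runs = [json.loads(subprocess.check_output([sys.executable, "-c", CHILD, memories]))
        for _ in range(2)]

print(runs[0]["state"] == runs[1]["state"])  # True: indistinguishable from the inside
print(runs[0]["pid"], runs[1]["pid"])        # different PIDs: only the OS knows which run is which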

It's even worse when you consider that if consciousness.exe ends when you go to sleep and dreams.exe starts, then every time you sleep you die and are replaced when you wake up.

>Judge us as fucking pussies
Then we need a good dicking from them to get over this fear huh?

Seems enough of it keeps going to give you the sense of time passing...

...Unlike surgical anesthesia, which messes with your sense of time for weeks, in addition to being a truly odd experience.

But I was aiming more for an "only God could know" type conundrum, and in this case, you're God. (Thus the comparison is kinda flawed.)

>Then we need a good dicking from them to get over this fear huh?
I think the fear would be much less if we had an asexual reproduction mode like (some?) sharks.

Sharks are actually pretty skittish - and evolution would be a whole lot slower.

How much of our bodily functions are controlled by our brain but not our consciousness, though? It hardly weakens the argument.

>only God could know
Well, the PID would probably change, so not really... More, only God would care.

>WTF messed with my computer!?

But yeah, the comparison is kinda bad.

What's a better comparison, a quantum mind that can inhabit multiple bodies? A soul that will always return to the original body?

It's more a question of ownership though.

If you're killed, and a copy of you made elsewhere, consciousness may continue, but it's no longer yours, it's the copy's.

If you're broken down to energy and transferred somewhere else only to be reassembled though, you can at least claim you retain ownership, as you are in fact the same object, just having changed configuration temporarily.

Possession is 9/10ths and all that.

But your consciousness could be considered as owned by your brain, as it basically hires you to look after the body it lives in.

But you're assuming that what turns up at the end is a copy, and therefore not you, and therefore that you are dead, and therefore that what turned up was only a copy, etc.
It's entirely circular.

I'm saying, from a legalese perspective, you died. Someone just like you is now wandering the universe, but there's no ties between the two of you, aside from the fact that you're identical.
Identical twins, raised in identical circumstances, are not legally the same people, and if one dies, neither ownership nor consciousness is transferred.

How is it circular? Seems pretty unidirectional. You died, copy spawned, end of story.

>I'm saying, from a legalese perspective, you died. Someone just like you is now wandering the universe, but there's no ties between the two of you, aside from the fact that you're identical.
>Identical twins, raised in identical circumstances, are not legally the same people, and if one dies, neither ownership nor consciousness is transferred.
Except that by the time of birth (when personhood is conferred by law), each twin has accumulated sufficient variation of experience and structure that they are obviously distinct. At your last moment and the copy's first moment of 'existence', your experiences and physical structure are precisely identical.
A technicality I suppose.

>You died, copy spawned, end of story.
Well, I would disagree. You didn't die, because your 'copy' is actually you, end of story.

Seems like there'd be a case of super inheritance then though.

Well, if I clone you, dump all your memories into the clone, age and tone the clone to match, and DON'T kill you, is that clone you?

If this teleporter just copies you at a distance, without killing you, is that copy now you?

I mean, what about killing the original suddenly makes the copy you?

Nuttin, it's still a copy - living its own life, presumably with its own consciousness, from that day forward - the only difference is it's still alive.

With the crazy way the law works, if this sort of teleportation ever became a thing, there'd have to be a law that willed your estate to the copy. (Which would actually be great in a sci-fi, if it was just something no one ever talked about due to that nagging suspicion.)

>If this teleporter just copies you at a distance, without killing you, is that copy now you?
Yes, of course it is.

You can't have two (you)'s. You've got you, and your copy, and they can immediately get into a debate as to which one gets to keep dating Sally and owns the house.

going by Ockham's razor I'd say yes.

we're just biological machines, only our current tech stops us from duplicating it in a non-bio machine.

So a consciousness should be possible inside a machine, be it a digital or a physical one.

this is some high quality bait

>reverse baiting
here's your (you)

>You can't have two (you)'s. You've got you, and your copy,
What is your copy but another you, over there instead of here? Another you means two yous, by definition. He would be you and you would be him, there's no difference.

Until you have an empirically verifiable definition of consciousness, you can't say, one way or the other. You can only guess, and maybe make legal statements as to the machine's rights.

It'd be easier to define and make a test for sapience I suppose... This has never really been thoroughly defined either, but at least people have made some pot-shots at it with signing chimps and apes.

The only empirical definition for consciousness we have is medical, and isn't what we're talking about here - though, I suppose the machine ain't ever going to pass that one.

Hah, I remember that happening on a very large scale in a webcomic. In a related event, one guy became a giant demographic by himself. There was some social turbulence to say the least, but they didn't dwell on it much unfortunately.

The difference is in origin, in addition to chronology, and if you created the copy on the other side of the room, instead of on another planet, you can bet your ass this is the first thing the original would bring up when it came to Sally.

>a test for sapience

some humans, however rare they may be, are bound to fail it
it's just an open door to /pol/ tier politics

> you can bet your ass this is the first thing the original would bring up when it came to Sally.
Only if the cultural and instinctive tendencies of people didn't shift with the new technology. Really you'd just clone sally and both get her in this scenario, and think nothing more of it.

Well, we do accept that children (and yes, some mentally handicapped adults) lack certain reasoning abilities at least, and you can test for those... Still, way closer to having something to work with than you are for consciousness. (Which is "wut does that even mean?" tier.)

how can time travel be real

By jet lag.

By necessitating the need to clone Sally, you're still delineating between yourself and the copy.

...and again, in the soda can scenario... The newly created duplicate soda can is not the original - you don't even need to bring consciousness into this. If someone else picks up the soda can copy and claims it to be his, is the original also now his? No. There's an inherent difference between originals and spawned copies.

>...and again, in the soda can scenario... The newly created duplicate soda can is not the original - you don't even need to bring consciousness into this. If someone else picks up the soda can copy and claims it to be his, is the original also now his? No. There's an inherent difference between originals and spawned copies.
If you insist. Personally if you have two objects identical in physical structure, to me that means they're the same.

Also I think a society which developed teleporters would rapidly adopt them and think little of your qualms. From an evolutionary PoV, as long as your genes continue existing it's all gravy. People would eventually not have an instinctive fear of teleporters.

Oh, I'm not denying that mankind would bend over backwards to rationalize the convenience of even this sort of teleportation...

...But to say physically identical things in different places are the same thing is just objectively folly. There are physically identical objects all over the damned universe.

>...But to say physically identical things in different places are the same thing is just objectively folly.
But you have no issue identifying your past self as you, when in fact your past self is more different from you than a teleported clone. This is illogical.

Some smart way should be devised. Like: copy the brain into a machine and run it, then connect the two brains together - let the subconscious mind run on the original while the aware part runs on the computer, so the computer is aware of the organic one - then just kill the organic one while that's running, which also enables the digital subconscious.

Again, there's a link in the state changes involved, one that's objective at the quantum level. Even barring consciousness, again, if you melt down the soda can and remake it from the same stuff, it's a state change in the material, not a new object. If you, on the other hand, make a new can using data and material elsewhere, it's a new can, regardless of how perfect a copy it is.

Will the organic consciousness be in the computer because of this?

If not, you failed.

You can't reverse the entropy of a melted can.

Basically it will make you feel as if you never died, but rather that you suddenly jumped inside a computer.

Otherwise you would just simply cease to be as your original brain will eventually die, without really experiencing your digital copy.

If you have a bunch of nanites that slowly convert the biological brain into a mechanical one, then you may have a Ship of Theseus type of issue, but otherwise...

That is not relevant nor how that works, when we're talking about atomic rearrangement of matter into a duplicate object.

Idk man, I think your thinking on the matter is being clouded by growing up in a Christian society, which has a strong cultural belief in a unique soul that occupies each human body

>entropy
>melting
failed at physics 101?