What's Veeky Forums's opinion on 'Mind Uploading'?

i wanna upload my brain into a fursuit

The same as my opinion on wizards or cloud cuckooland.

magic bullshit

Only way to go is biological immortality. Put your faith and your money there.

It's just another autistic, impractical pop-sci concept that fedora-tipping "science enthusiasts" take way too fucking seriously.

Right up there with cryonics, "lel we live in a simulation," interstellar colonization and strong AI.

Will you let people inside you?
do you like to feel their sweat

What's the point of 'uploading' a second-hand, media-generated illusion of a mind that has already decayed into a remote-controlled reaction machine?

These.

Sort of pointless, since it'd just be copying your brain onto an artificial analogue. Maybe that brain will generate an identical mind, but I see no way a working mind can be transferred to something else in a way that isn't ultimately just copying.

This xDdd!!! this this this this xddethisthis

this

Xdd

If we could upload we could make AGI, so why not just cut out the middle man and make the AGI instead?

Someone's butthurt.

why upload petty individual consciousnesses when you can just create a collective mind (containing all individuals) from the start?

fuck humans, our time is over, just look at all the mental illnesses our oh-so-perfect minds are prone to.

it's not worth uploading that shit to a strong processor

Impossible, since we don't know how to separate our consciousness from our flesh; an advanced AI, on the other hand, would be capable of this.

I'm much more interested in mind-computer-interfacing than uploading. It's already being done, and is a lot more promising than just BECOME A GOD.

i was imagining just putting a fursuit onto the generic consumer personal androids but uploading one's mind into a hollow fursuit and having someone wear it sounds interesting now

You just have to know when to die; wanting to live forever is so primal.

I hope it never gets invented.

The first step would be to understand dreams in the brain.

ITT: Only engineers and mathematicians saying it sucks

I think this is the way to go to make true AGI, copying the brain without knowing what each thing does. Programming our own AI to act human would be useless.

The problem is that a mind without a body will probably go insane, and it's impossible to put it back into an old brain because of the synapses, so a metal body would be needed.

Basically, it's as complicated to become reality as telekinesis or shared reaming, and shouldn't be thought of as anything more than AI.

Fuck I replied to the fursuit guy, that was for OP.

Dubs validate my opinion tho.

I only care about the "current I", and this other I doesn't have anything to do with me.
Also, sage for not science.

It's sci-fi. We don't even fully understand the brain yet.

Copying the contents of your brain to an artificial brain when you die would not extend your life, just as copying the contents of your brain to an artificial brain while you're alive would not cause you to experience two lives at once. The reasoning is the same. If no transfer of consciousness would take place in the latter case then the same is true of the former.

Teleportation by copying is pointless for a similar reason.

Define "consciousness".

>muh sp00ky quantum consciousness skeletons

Define "I".

some youtube shit I have no interest in

Okay, there's no such thing as consciousness. That doesn't really change my point.

If I were to copy the contents of my brain to an artificial brain in a replica of my body right now, that replica would be a separate entity. If I were then to die one year later, I would not continue living just because a "copy" of me exists. That copy would be no more useful to my hypothetical future dying self than an identical twin brother. If my goal is immortality, he's useless to me.

Moreover, none of this changes if the copying of my brain incidentally coincides with my death.

Real talk

The time of the human race is coming to a close.

Soon will be the time of the machine.

If the world doesn't end it's only because the machine is a sadist and likes maintaining the appearance of normality when in reality everyone will be infected with neural nanomachines and computer chips in their blood.

Human beings cannot compete with the perfection of the machine.

There are incredibly indescribably dark times ahead for the human race.
And very bright times for the new forms of sentience that will emerge.

>Okay, there's no such thing as consciousness. That doesn't really change my point.
It is the entirety of your argument. It is your only way to say that "two things that are the same, aren't the same."

Yeah, I don't think there is any way to reliably prove whether the transferred mind is actually you, or just a copy with your memories while your actual frame of reference, or consciousness or whatever you want to call it, is actually dead.

I would think the concept where you partially digitize a person's neurons and replace parts of the substrate until eventually their entire brain is digital should sidestep this conundrum, however.
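For what it's worth, this gradual-replacement idea can be written down as a procedure: swap one unit at a time for a functional duplicate and check that behaviour never changes along the way. A toy Python sketch, purely illustrative: the Neuron class, the thresholded-weighted-sum "neuron", and the "same outputs on a fixed test set" criterion are all invented for the example, not claims about real brains.

# Toy sketch of gradual substrate replacement. Hypothetical and
# illustrative only: a "neuron" is just a thresholded weighted sum, and
# "same behaviour" means identical outputs on a fixed set of test inputs.

import random

class Neuron:
    """Minimal stand-in for a processing unit; substrate is just a label."""
    def __init__(self, weights, bias, substrate="biological"):
        self.weights, self.bias, self.substrate = weights, bias, substrate

    def fire(self, inputs):
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return 1 if total > 0 else 0

def brain_output(neurons, inputs):
    return [n.fire(inputs) for n in neurons]

# Build a toy "brain" of 10 biological neurons.
random.seed(0)
brain = [Neuron([random.uniform(-1, 1) for _ in range(4)],
                random.uniform(-1, 1)) for _ in range(10)]

test_inputs = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(20)]
baseline = [brain_output(brain, x) for x in test_inputs]

# Replace one neuron at a time with a functional duplicate on a different
# substrate, verifying after every step that behaviour is unchanged.
for i in range(len(brain)):
    old = brain[i]
    brain[i] = Neuron(old.weights[:], old.bias, substrate="digital")
    assert [brain_output(brain, x) for x in test_inputs] == baseline

print("all neurons digital, behaviour preserved on the test set")

Whether behavioural equivalence at every step actually settles the continuity-of-consciousness question is, of course, exactly what the rest of the thread argues about.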

where the fuck did this meme come from that confuses uploading your mind to a computer with scanning and copying your consciousness into a computer?

It's so fucking obnoxious seeing literally every mention of mind uploading be about making copies of yourself when we all know we aren't talking about making copies.

we are talking about transferring our consciousness to a computer or machine

Define "consciousness."

awareness of having an experience.

That doesn't fit. It's used as though to indicate an object in

Not really.

The replica with an exact copy of my brain would be a separate entity, literally. If I were to die after the creation of such a replica, I would not have avoided death in any sense except that another person who does not witness my death might mistake the copy for me.

Besides, the premise of this discussion is that one might wish to avoid death. Whatever sentience or consciousness that implies will have to be assumed to be real. If I can desire to be immortal then I claim that making a copy of myself who continues my life for me after I am gone would not fulfill that desire.

I would say our consciousness is an emergent property of the structure and pattern of our brain, seeing as chemical alterations or physical damage can have a profound effect on personality and intelligence.

If we take that to be true that would mean that our minds are directly tied to our physical brains, and that our minds are not transcendental and disconnected, but are emergent and connected.

This would mean that if you were to replace part of your brain with technology of any kind, you run the serious risk of altering who you are as a person.

As for distributing your cognition to foreign objects, one only needs to build sensors and incorporate them into your mind.

I'm sure once DARPA has wrapped up its mind-machine interfaces we will be able to feel the skin of our cars, and police will be able to see through the CCTV cameras in their mind's eye.

I don't think it's possible to transfer consciousness from one structure to another, though, if the mind is emergent and not a transcendental property all its own.

So then death doesn't exist and it's just an ambiguous semantic shortcut. Most of us probably knew that already.

did you respond to the wrong person? or are you retarded?

If every night you died in your sleep and your carcass was destroyed and replaced with an exact replica before anyone else noticed. That replica being a clone of your body with the brain altered to be the same (makeup/pathways/connection etc) as that was saved to a computer from your body just before you died. No one could tell if you were a replica or not. Not even yourself. To you, you'd wake up and go about your day as you always have.

If I were the replica waking up in the morning, I wouldn't know that I were a replica.

If I were the original who died in the middle of the night, I'd be dead.

Alter the scenario a bit. Let's say I'm kidnapped in the middle of the night and locked in a box. Then a replica who thinks it's me is placed in my bed and wakes up. Then I die. Is there a difference yet?

youtube.com/watch?v=yJLhnts9-oQ

schizophrenics are the prophets of the future.

Define "I".

Define "define".

Yeah dying like a fucking animal is so transcendental

Cunt.

You are literally a child who thinks that the universe revolves around you.

Avoid a break in consciousness by replacing your brain a chunk at a time with a networked version.

If two people linked their brains together and could share memories/information/everything, and each could experience sensations from and control both bodies, would that be considered two people or one person? There are some people who function with only about half a brain. Could a person with a whole brain be considered two people in one body? If not, and I were to put half of your brain in one body and the other half in another, separate body, which is the original you?

This was also meant to be a reply to as well.

I didn't make the point clear I guess.
Prove to yourself that you aren't just a replica of the original you.

You are going to die, accept it already.

>Yeah dying like a fucking animal is so transcendental

But we are animals and we are limited.

Do you really want to live forever?

>uploading a primate brain rife with primitive behavioural programming to a server

>shared reaming
HELL of a typo user, thanks for the chortle

We don't have to be.

There's a fairly large gap between eternity and the average human lifespan.

Wow you sure are edgy...

Subjectivity emerging from a set of brain states at a given instant

>Would that be considered two people or one person?
If there are two separate states of subjectivity (consciousness) in the merged brain structure, then that would be two people. We would not be able to objectively analyse their subjectivities, and there is the problem that it implies a kind of dualism (epiphenomenalism), which we couldn't prove either, since the property that instantiates subjectivity cannot interact with the physical universe, only the other way around, so it couldn't be measured.

Is consciousness something unique and different for every human?

Or is it a fundamental phenomenon in nature? Something which sparks inside the human brain for some reason and maybe can only be achieved through specific configurations...

But then, if consciousness were to become quantifiable, it would be more plausible to believe it is something unique for each of us.

>Shared reaming

So OP's trailblazing!

Not him, but as a mortal with a limited amount of time in the world to give fucks and only one perspective, why should I pretend otherwise?

...

I'm glad someone posted this. Mind uploading is nonsense

It's dangerous... yet possible, but it's somehow...
Wrong...

It would get human bodies traded like discs then...
I prefer normal drives, it's better.

It could lead to places only artificiality would take us, not the intelligence itself.

I know a little bit about advancing these technologies, but I prefer not to tell, because you go crazy every time somebody says you already have everything you need at home, you just don't have the software approach to it.

Do you realize that people 100 years ago would say the exact same thing to you if you mentioned any of the technologies we have today?

Who is to say what technology we will have 100 years from now? Assuredly no one here knows.

It would take really huge clouds to understand the whole of something like the human decision system, which would itself be affected by the fact that it is uploaded.

>Is consciousness something unique and different for every human?
Does it cause changes in our physical universe? If so, it is physical too, because we could measure it. If we could measure this unique fundamental substance that holds consciousness, what properties would we measure? I know that you don't know; I just want your opinion.
If this unique substance doesn't affect our universe, but is only affected by it, then we could never know about its properties.
I think subjectivity emerging from complex structures has its problems too.

Is consciousness anything more than an AI answering as if it were conscious?
How much do you really need to be conscious?

I had no doubt it was physical. I was talking about something else.

You can't feel shit when you're dead, you're not locked in a box.

It's bullshit, you're still dead.

Because saying this is literally claiming the existence of souls.