Can the human mind ever be replicated digitally?

Digitally no, but a synthetic construct capable of recreating the complex chemical interactions between neurons is conceptually possible

And why do you think that it is digitally impossible?

Well, impossible is maybe an overstatement, but it would require an absurd amount of processing power. Even if you boil the system down to simple action-potential emulation controlled in binary, having to process billions of these interactions consistently on an electronic processor would kill it. Even allowing for the heat to be dissipated, the RAM requirement would be prohibitive. It gets even worse when you consider that human brains don't work according to machine logic: our information storage and retrieval systems are disasters by normal programming conventions, to say nothing of how our brains interpret sensory information.
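For a sense of scale, here is a rough back-of-envelope sketch of the memory and compute cost of a naive whole-brain emulation; the neuron/synapse counts are commonly cited ballpark figures, and the per-synapse state, timestep, and op count are assumptions chosen purely for illustration:

```python
# Rough back-of-envelope estimate for a naive, binary action-potential
# emulation of a whole human brain. All constants are ballpark figures
# chosen for illustration, not measured values.

NEURONS = 8.6e10            # ~86 billion neurons (commonly cited estimate)
SYNAPSES = 1.0e14           # ~100 trillion synapses (order-of-magnitude figure)
BYTES_PER_SYNAPSE = 8       # assume one weight plus a little bookkeeping each
UPDATE_HZ = 1_000           # assume a 1 ms simulation timestep
OPS_PER_SYNAPSE_UPDATE = 2  # assume one multiply-accumulate per synapse per step

memory_bytes = SYNAPSES * BYTES_PER_SYNAPSE
ops_per_second = SYNAPSES * OPS_PER_SYNAPSE_UPDATE * UPDATE_HZ

print(f"Synaptic state alone: ~{memory_bytes / 1e12:.0f} TB of RAM")
print(f"Update cost: ~{ops_per_second / 1e15:.0f} PFLOP/s, sustained")
```

Even with these very generous simplifications the totals land in the hundreds of terabytes of state and hundreds of petaflops of sustained throughput, which is the point being made above.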

Better to adopt the specialized organic electrochemical process directly, since we're much closer to being able to do that thanks to recent breakthroughs in producing human organs in a lab.

No. We still have no idea how the human brain actually works, not to mention all the complex interactions between the brain, the body, and the environment.

At best, you could theoretically scan the brain, mapping each individual neuron, and upload it into some sort of supercomputer (far, far beyond our current technical ability), but that wouldn't actually be a replication; it would be some sort of weird neural network. It wouldn't have the memories, or the emotions, the hormones, the bacteria, etc. It would be a very hollow, very inefficient sort of calculator-thing.

If by "digitally" you mean with conventional computers (even if they are 10000x times more intricate and powerful than what we have today), we would hit some technical problems such as halting (e.g scientificamerican.com/article/why-is-turings-halting-pr/). Basically, a machine such as the ones we currently build cannot decide if a certain input will cause it to go into an infinite loop or not - i.e. no robot or AI whatsoever built in the way we currently do it, will ever be able to prevent itself from eventually getting stuck as it comes in contact with new data.

There is also the problem of the body: people already report significant psychological changes after losing a limb or becoming otherwise disabled, let alone if we build a "person" that doesn't even have a proper body (existing "digitally", as you put it). Would it, even if we could build it, think, behave or even recognize itself as a human mind if it did not have a humanoid body?

Finally, there is the matter of computation requirements, as mentioned earlier in the thread. It is probably easier to just scrap the "digital" thing altogether (at least as we think of it nowadays) and think up hardware that more closely mimics organic processes.

Depends on what you define as 'digital'.
Whether you're talking about organic brains, semi-organic brains or non-organic brains, the answer is yes.

The 'mental functions' the human brain/mind possesses can be replicated.
Other intelligent life could have consciousness, or its own form of it, and could even be non-carbon-based (contentious, though).
Whether we could comprehend how they think remains to be seen; the issue of qualia applies to a human speaking with a non-human intelligence (not just human to human).

It's easier to use definitions and semantics of 'mind' as a conceptual map to better understand brain-state functions (there is only the physical, though; there is no remainder).
It stands to reason that any sufficiently complex physical system could start to generate a mind (mental characteristics: self/other, consciousness, higher-order cognition, self-awareness, etc.).

There's no reason to doubt that a non-organic system could replicate the work being done by neural firing patterns / neurotransmitters.

The more interesting / better questions are:
0. What form does intelligence/consciousness take when it exists purely in the virtual
1. What is intelligence / how does one define mental states reflective of intelligence / consciousness
2. How do you resolve qualia
3. What does intelligence look like under different evolutionary development set-ups (and what happens if something takes this into hyperdrive, i.e. takes direct control of the evolutionary machinery itself, making it an intelligently controlled process)
4. How do intelligent beings communicate meaningfully when their entire minds could be fundamentally different in both structure and workings (yet have the same outcomes).
5. If a machine / non-organic being develops consciousness, does it require a "life-bias", and can it reach our level of consciousness without one?

here,
Really awesome to see some serious answers on this subject. I've been interested in the idea of alternative means of synthetic life and what that means for transhumanism for a while now, and it's nice to know my thoughts on the matter aren't completely off-base (though I clearly have a lot more to look into).

transhumanism is too soon. it's like twelfth century monks talking about sailing to the moon in a wooden ship.

it's weird how many problems science fiction has brought to our attention that won't happen for hundreds of years, and how many it has ignored that are affecting us right now.

> worked on the 2012 census
> nine months of a great income
> 2016 they did it on the web
> they didn't employ any of us

my god, i've been replaced by the web.

based on current knowledge, it looks unlikely to me

I think most of this talk about computer consciousness has two flaws: first, it falls victim to the Chinese Room argument (that the computer is merely using clever wordplay, which we taught it, to mimic consciousness); second, it is based on the Cartesian, or Christian theological, assumption that the self/mind is unique and separable from the human body, which is probably not the case. I would argue that an independent computer consciousness might be possible, but an individual transcending his body is not.

It already is

For the most part there's no need to replicate it digitally. It would probably be easier to make something better.

also this:

backchannel.com/the-myth-of-a-superhuman-ai-59282b686c62

it's easy to see the Singularity people as a kind of religious fanatic.

wow. you might be interested in a not-dune novel by Frank Herbert: "Destination: Void". he addresses exactly this issue, although in the book they fall back on the good old "let's copy a human mind" thing mostly.

the result is sufficiently different for it to ruin their day.

I believe the foundational problem of AI is that it must work within programming. Someone would need to write the code for it, and that would necessarily limit its ability. It always has to stick to the rules of the programme, no matter how much effort you put into it, or how much variability it has. Even if it was written by another computer, that computer must itself be working within limits, and so the entire premise is based on a limited foundation.

The human mind is not a computer. It has plasticity to it, and infinite variability. You are not limited in how you think, or how you interpret the world. Children will happily imagine their toys can talk, adults can invent an entire new world for a book, artists can paint such abstract visions that any possible interpretation is correct. I do not think all this - the most analogue of outcomes - can be replicated with a digital system.

Plus, it can go horrendously wrong, as the variety of mental health issues shows, and those issues sometimes give us the best outcomes. Many brilliant artists, painters or scientists had or have noticeable mental health issues.

I was thinking about this sort of stuff a few weeks ago. If we could somehow clone and upload our mind onto a computer and try to talk to it, we would not be talking to the same person. This is because our body's hormones have a massive effect on the way we think and act. This is why I don't believe a person is just a mind, but a fusion of mind and body. If you take away either the mind or the body, you kill the person.

i am robot beep boop lmaoooo

Quantum computing would allow for a spectrum of possibility per computation, but it's very far off and would probably make the 'mind' seem schizophrenic
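To make the "spectrum of possibility" idea a bit more concrete, here is a toy state-vector view of a single qubit put into superposition; this is plain numpy arithmetic standing in for the idea, not code for any real quantum device:

```python
import numpy as np

# Toy state-vector view of one qubit: a classical bit is either |0> or |1>,
# while a qubit can sit in a weighted superposition of both until measured.
zero = np.array([1.0, 0.0])                  # the |0> state
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

superposed = hadamard @ zero                 # equal-weight superposition
probabilities = np.abs(superposed) ** 2      # Born rule: measurement odds

print(superposed)        # [0.7071 0.7071]
print(probabilities)     # [0.5 0.5] -- each outcome equally likely on readout
```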

Again, you could just emulate the functions of neurons and reduce everything to binary, but emulating them accurately would have extreme drawbacks
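As an illustration of what "emulating the functions of neurons" usually means in practice, here is a minimal leaky integrate-and-fire neuron, a standard textbook simplification of the real electrochemistry; all parameter values are arbitrary illustrative choices, not biological measurements:

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=-0.065,
                 v_reset=-0.065, v_threshold=-0.050, resistance=1e7):
    """Return membrane voltage trace and spike times for a current trace."""
    v = v_rest
    voltages, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration of the input current toward threshold.
        v += (-(v - v_rest) + resistance * i_in) * (dt / tau)
        if v >= v_threshold:          # threshold crossed: emit a spike
            spikes.append(step * dt)
            v = v_reset               # and reset the membrane potential
        voltages.append(v)
    return np.array(voltages), spikes

# 200 ms of constant 2 nA input produces a regular spike train.
current = np.full(200, 2e-9)
trace, spike_times = simulate_lif(current)
print(f"{len(spike_times)} spikes in {len(current)} ms")
```

This is the kind of drastic reduction the post is talking about: everything below the level of "integrate input, cross threshold, fire" gets thrown away, which is exactly where the accuracy drawbacks come from.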

That book is so shit.

Because discussing the problems we have right now isn't an easy thing to do. When people discuss the future, they are thinking with whatever emotions they have on hand about their small, small lives and trying to project that out to millions of outcomes. That's just foolishness. The precise reason the transhumanism discussion is important is that the technology we already have is causing difficulty and a vast amount of cultural change; we have to be prepared mentally for what's to come.
A.I. we have today is already more advanced than this.

>I believe the foundational problem of life is that it must work within the framework of DNA/RNA. It would need to have a code for it (A/T/C/G), and that would necessarily limit its ability. It always has to stick to the rules of the programme, no matter how much effort you put into it, or how much variability it has. Even if it was written/reproduced by another lifeform (evolved), that must be working within limits, and so the entire premise is based on a limited foundation.

Literally no, because analog systems can perfectly model things that digital systems cannot, since they involve things like irrational numbers, which digital systems deal with in only an approximate fashion.
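For what it's worth, the approximation point is easy to demonstrate with ordinary 64-bit floating-point arithmetic, which is how digital hardware normally represents such quantities:

```python
import math

# Digital hardware stores irrational values only as finite approximations.
# With 64-bit floats, sqrt(2) is merely the nearest representable double,
# so squaring it does not give back exactly 2.
root_two = math.sqrt(2)
print(root_two)                    # 1.4142135623730951 (truncated expansion)
print(root_two * root_two == 2)    # False: the round-trip is not exact
print(root_two * root_two - 2)     # tiny residual error, ~4.4e-16
```

Whether that finite error actually matters for modelling a brain is a separate argument, but the approximation itself is real.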

With the bioengineering industry going strong, I highly doubt there is any reason to reinvent the wheel here. The future may be synthetic, but it will also be organic.

youtube.com/watch?v=HMepbHaxVwc

Theoretically it is possible

Socially we are incapable. Unless there is a massive change in thought allowing for excellence to be achieved through mass cultural and ethnic cleansing and subsequent eugenic protectionism we will not be able to move forward into such demanding fields.

Btw the word 'digital' is a limit; obviously a synthetic organ can be created, and there may be other, better alternatives to come. Synthetic materials and obscure conductors can open new opportunities.

That's not how DNA works you mongoloid. Your analogy is bad and you should feel bad.
Plus, at least one synthetic replacement base pair has already been created in a lab (that we know of publicly)

No. What makes the human mind special is its ability and incredible desire to understand other human minds.

the hormonal effect would need to be simulated also. the stomach bacteria would have to be simulated as well, because they can influence mood and the immune system too.

hell, you might end up having to simulate the entire body.

>A.I. we have today is already more advanced than this.

we do not have AI today. we have a lot of people claiming that their computer learning system is AI. we have a lot of people claiming their chess program is AI. we have even more people claiming that AI is something different from what people thought it was last week.

we do not have a program that can emulate a human mind, so either tell us what you think AI is, or go back to jerking off over diegetic prototypes.

This
Intelligence is an important part of Artificial Intelligence. The ability to mimic and copy is not intelligence, even if it can pass a limited Turing Test.

Seeing as we cannot even explain consciousness, no. It's more likely that matter is a product of consciousness and not the other way around.

Hence "artificial"

So matter affects us in a way that demands a consciousness to deal with it? Or at least consciousness is a good tool to deal with matter?