Reminder that the future is in augmented intelligence, not artificial intelligence.
Improving what we already have is far more practical than rebuilding from the ground up.

Robots can eat shit.

>Reminder that the future is in augmented intelligence, not artificial intelligence.
And how do you propose that is going to work? Now instead of just developing AI systems you have two problems: developing AI systems AND interfacing them with really complicated biology.

Pic related is you.

>developing AI systems
See, you don't.
Your new job is simply to understand and alter function.

you are wrong. the only way we will ever truly understand ourselves and how we work is when we can rebuild it all.

Bullshit, you can build logical models to predict, recognize patterns, and engineer technology without having a complete theory of the universe.
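To illustrate: prediction without theory can be as simple as k-nearest-neighbour voting, which just memorises labelled examples and sides with the closest ones. A minimal sketch, with toy data invented for the example:

```python
# Predict a label with no underlying theory of the data: k-nearest-neighbour
# classification memorises examples and takes a majority vote among the
# k closest ones. All data below is made up for illustration.
from collections import Counter

def knn_predict(train, labels, point, k=3):
    """Classify `point` by majority vote among its k nearest training points."""
    nearest = sorted(range(len(train)),
                     key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], point)))
    vote = Counter(labels[i] for i in nearest[:k])
    return vote.most_common(1)[0][0]

# Two clusters with no "theory" behind them, just observed examples.
train = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
labels = ["a", "a", "a", "b", "b", "b"]

print(knn_predict(train, labels, (0.5, 0.5)))  # near the first cluster: "a"
print(knn_predict(train, labels, (5.5, 5.5)))  # near the second cluster: "b"
```

No model of why the clusters exist is needed; proximity to past observations is enough to predict.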

not at all, it's likely that our brains are not physically equipped to accept new information inputs. At least not without inducing brain damage and/or insanity. Most likely, no matter what we do, the brain won't derive anything from a new input because it's biologically not equipped to do so, especially if the input is in a format incomprehensible to us. At most we can augment our current senses and directly fuck around with some parts of the brain.

not at all, rebuilding is overly complex. Our brains are the product of millions of years of evolution, very convoluted, not some direct system running on a clear-cut theoretical framework. We will never rebuild one, because it's inefficient and wasteful.

never mind, i misread, i agree: while this has some merit regarding our brains, it is not really possible. they're too complex; we need a rebuilding-level understanding.

first, we need a philosophical understanding of how we work. even simple things like
>how do we recognise objects
>how do we map functions onto objects
we do not understand.
as was said, yes, we can recognise the pattern that the function of a cup is to lift it up and drink, but how we actually get to that point is a whole different ballgame.

The future is in using synthetic bio/nanotechnology to manipulate our brains to simulate virtual realities.

Traveling the world from your home. Having sex with AI slaves. Yes, the possibilities are endless.

I'm not hearing why we can't reduce portions of the brain to their functional or multifunctional components and work around it.
Simple changes in development, accidental trauma, or genetics can lead to things like synesthesia, sociopathy, etc.
The brain seems fairly easy to fuck with, and there are certainly many redundant components if we can survive without portions of it.

Ok, but humans are a biomechanical machine already, so why would you want your iPhone installed in your skull? Wetware is a psychotic idea.

Sure, but claiming that we don't know how something functions is different from asserting that it isn't malleable when there's plenty of evidence to the contrary.
Furthermore, as user already said, once the brain is understood and replicable, there's no reason it can't be redesigned to be that way.
If we're at that point of conception of genuine artificial intelligence, there's no reason to just give it to a foreign body instead of reaping the benefits ourselves.

Why would you not want to be able to process and sort information as quickly as a machine?

that's a reductionist view of the world. there is so much more beyond the material layer than we perceive or understand.

>once the brain is understood and replicable
and that will not happen until we can rebuild one. if ever.

>and that will not happen until we can rebuild one. if ever.
Already doing it with rats.

and how is that going?

spectrum.ieee.org/biomedical/imaging/ai-designers-find-inspiration-in-rat-brains
Well enough, as far as can be said.
Don't you find it interesting that so much AI development revolves around copying what we already have and not developing abstract algorithms?

>thinks brain can't handle new information inputs
>tfw neuroplasticity

they will find many interesting things, but not more understanding of intelligence or the deeper workings of anything. I can only repeat what I stated earlier: we need a philosophical understanding first and then link that with the neuroscience findings.

>Don't you find it interesting that so much AI development revolves around copying what we already have and not developing abstract algorithms?
not at all. all we need to know is already here and we just have to understand it. it's the least abstract thing you can think of. it's the very essence of being we want to reproduce.

Mechanical augmentation will only get us so far.
I think one day we're going to focus on the biological more than anything else.

Rather than prosthetics, limb transplants.
Rather than computers in our heads, genetically improved brains.
Upgrade the species instead of shoving a bunch of metal bits into it that will always wear out and never merge well.

>we need a philosophical understanding first
I don't think that's true.
There are plenty of mechanical functions which the brain can be reduced to, and even simply improving those functions is enough to substantially improve human cognition.

For instance, the idea of studying these rat brains is to understand how the rats' brains navigate complex terrain, and then to implement that in a machine. It stands to reason that you could also alter how the rat traverses terrain if you simply reverse the process.
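For a crude sense of what "implement that in a machine" can mean, here is plain breadth-first search over a toy grid (a simplistic stand-in, not the researchers' method; the maze is invented for the example):

```python
# Breadth-first search over a grid: 0 = open cell, 1 = wall.
# Finds a shortest path from start to goal, or None if unreachable.
from collections import deque

def shortest_path(grid, start, goal):
    """Return a shortest list of cells from start to goal, inclusive."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None

maze = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(shortest_path(maze, (0, 0), (0, 2)))
```

Real work on rat navigation involves far richer machinery (e.g. grid and place cells); BFS only shows the machine side of the analogy at its simplest.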

Take a portion of the brain whose abilities are exceeded by modern computation and work the computer's method back into the biological function.

That work would be harder than finding a theory which derives its behavior imo.

>what is drugs

>For instance, the idea of studying these rat brains is to understand how the rats' brains navigate complex terrain, and then to implement that in a machine.
and they will fail, because navigating complex terrain isn't something that one small part of the brain does but something the whole rat does. Furthermore, to navigate terrain you need a way of perceiving it and making sense of it, and we are back to the old problem of
>how do we recognise objects
>how do we map functions onto objects

every meme word in use, like AI, deep learning, machine learning, etc., is literally just some statistics and a more or less complex function that gives an output that again needs hand-holding by humans.
and that's a naive and wrong approach in the long term, in my opinion.
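To put numbers on that claim, here is a minimal sketch (not anyone's actual system; the data, learning rate, and epoch count are invented for illustration) showing that a single "neuron" is just a parameterised function tuned by gradient-descent statistics:

```python
# A "neuron" is a weighted sum pushed through a squashing function,
# with three numbers (w1, w2, b) nudged repeatedly to reduce error.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_neuron(data, epochs=1000, lr=0.5):
    """Fit w1, w2, b by stochastic gradient descent on cross-entropy loss."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            y = sigmoid(w1 * x1 + w2 * x2 + b)
            g = y - target          # gradient of the loss w.r.t. the pre-activation
            w1 -= lr * g * x1
            w2 -= lr * g * x2
            b -= lr * g
    return w1, w2, b

# Four labelled examples: the truth table of logical OR.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, b = train_neuron(data)
preds = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
print(preds)  # learned truth table of OR
```

Nothing here is mysterious: a weighted sum, a squashing function, and repeated error-driven nudges to three numbers.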

You can't augment intelligence without an artificial intelligence to give you the "road map" to how the brain fully functions in the first place, so, checkmate, meatbag.

bump