Concerning General Artificial Intelligence

The idea of General Artificial Intelligence is fascinating to me. However, I am skeptical as to whether it will be achieved within our lifetimes. If you listen to futurists and philosophers such as Ray Kurzweil or Nick Bostrom lecture, or even read their work, you might be highly convinced of the ease with which we will soon be (are?) able to regularly emulate and exceed human-level intelligence in a computer. The reason for my skepticism is threefold:
> 1. The fundamental mechanisms of intelligence/sentience on a microbiological level are still not fully understood.
> 2. The organization of the human mind into stratified, hierarchical "nodes," as postulated by both of the authors mentioned as well as others, is also not clearly understood.
> 3. Most, if not all, estimates of the space required to store all human memories suggest it is already achievable. However, the speed of human thought, given the complex conclusions we draw from limited information, has yet to be matched, in my opinion.

Now, I see these as fundamental barriers to developing GAI, for several reasons:
> 1. Emulating our own minds, arguably the most complex things we know to exist, is probably the path of least resistance to true GAI.
> 2. Neural networks function as a stratified, node-like system modeled on neurons, but this retains the same problems mentioned above.
> 3. The speed-of-computation problem is quite frequently dismissed with appeals to the singularity and "Moore's Law," both concepts I am extremely skeptical of, since neither is proven; each is mere speculation about the future.

That's my take. Idk if Veeky Forums talks about AI but I've been thinking about the possibility that there could be a GAI somewhere out there. That being said, I don't think it's IMPOSSIBLE, just unlikely.

What are your thoughts?

...

> implying artificial intelligence is /x/
Way to add to the discussion user

What is general AI and what's Moore's law? Do we really need to understand the brain for AI?

>Do we really need to understand the brain for AI?
Having some understanding of what a consciousness is might help us create one, yes

What do you mean by consciousness? Something like us, or just something really smart that can learn/recognise things?

I'm just thinking, can't neural networks do the job if they're big enough and complex enough? The brain is complex after all.

The ability to think like a human. The ability to be creative, to think in abstraction and metaphor, to identify incongruities as easily as humans do. We're going to use it to solve problems, so it can't be a slave to its programming. The ability to be adaptable, think outside the box, and logically source and piece together the data it needs to solve problems is essential.

> Moore's Law
The observation that transistor density (and with it, computing power) doubles roughly every two years, a trend based on the shrinking of transistors, which is already facing subatomic limits.
> General AI
Theoretical artificial intelligence with the ability to master and learn tasks similar to the way humans do. Whereas a self-driving car uses task-specific AI to drive, a general AI uses its mind to solve novel and abstract problems.
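To put the Moore's Law definition above in concrete numbers, here's a quick sketch of what exponential doubling implies. The two-year doubling period and the ~2,300-transistor Intel 4004 (1971) baseline are commonly cited figures, not from this thread, and the formula is an idealization that ignores the physical limits mentioned above:

```python
# Idealized Moore's-Law-style growth: transistor count doubling
# roughly every two years. Baseline: Intel 4004 (1971), ~2,300
# transistors (commonly cited figure; an assumption for illustration).

def projected_transistors(year, base_year=1971, base_count=2300,
                          doubling_years=2):
    """Transistor count if perfect doubling held from the baseline."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(year, round(projected_transistors(year)))
```

The point of the sketch is just how fast the curve runs away: fifty years of perfect doubling multiplies the baseline by over 33 million, which is exactly why the trend can't survive transistors shrinking toward atomic scale.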

That's the implication of the work done by the people mentioned above, as well as others. However, neural networks as we know them today are very complex yet specialized. They are dedicated to specific tasks, e.g. driving, fixing photos, facial recognition, search-result optimization, and contextual linguistics. All these tasks are incredible in their own right but pale in comparison to the theoretical general AI.
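The "stratified, node-like" structure mentioned earlier in the thread can be shown in a few lines: each layer is just a weighted sum of the previous layer's outputs pushed through a nonlinearity. The weights below are made up for illustration; a real network learns them for one specific task, which is the specialization point being made above:

```python
import math

def sigmoid(x):
    """Squashing nonlinearity applied at each node."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # One stratum: every node sums its weighted inputs,
    # adds a bias, and applies the nonlinearity.
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(x):
    # Two hidden "strata" feeding a single output node.
    # All weights/biases here are arbitrary illustration values.
    h1 = layer(x, [[0.5, -0.2], [0.3, 0.8]], [0.1, -0.1])
    h2 = layer(h1, [[1.0, -1.0], [0.4, 0.6]], [0.0, 0.2])
    out = layer(h2, [[0.7, -0.3]], [0.05])
    return out[0]

print(forward([1.0, 0.0]))  # some value in (0, 1)
```

However complex the stack gets, it's still a fixed pipeline of layered arithmetic tuned to a single task, which is why scaling it up doesn't automatically yield the general, abstract problem-solving discussed in this thread.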