What do you think the key breakthrough or moment will be for AGI?
- More computing power?
- Unlocking a specific algorithm for general intelligence?
- Just incremental steps from the bottom up. AKA slowly adding more capability until it reaches generalizing ability.
- Top-down approach: recreating human brain patterns and using them to create general intelligence.
What does your intuition lead you to believe on this subject? How will the singularity happen?
AGI will be done via deep recurrent neural networks. All the current """""deep""""" learning stuff is primarily focused on image recognition with feed-forward neural networks, because they are the easiest to implement as GEMM. Recurrent neural networks are the only ones that are Turing-complete and the only form of AI with the potential to do abstract "logical" thought and planning. However, the best recurrent neural networks being used today rely on LSTM, which is woefully limited when you consider the importance of synaptic topology. I think the best recurrent neural networks will be topologically generated via evolutionary algorithms, selected using a competitive/cooperative fitness metric: force the networks to compete with each other and work on teams. There also needs to be a form of communication between the networks where they can encode their own language.
If you can do that, and throw a few tens of millions at it, I think you would be close to getting an AGI.
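That topology-evolution idea can be sketched as a toy genetic algorithm. To be clear, nothing here is a real NEAT-style implementation: the genome encoding, the echo-the-input fitness task, and every constant are made up for illustration, and the competitive/team selection described above is left out.

```python
import math
import random

N = 4  # neurons per network

def random_genome():
    # genome = recurrent weight matrix plus a binary connectivity mask (the "topology")
    return {
        "w": [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)],
        "mask": [[random.random() < 0.5 for _ in range(N)] for _ in range(N)],
    }

def step(genome, state, x):
    # one recurrent update: state'_i = tanh(sum_j mask_ij * w_ij * state_j + input)
    new = []
    for i in range(N):
        s = x if i == 0 else 0.0  # feed the input into neuron 0 only
        for j in range(N):
            if genome["mask"][i][j]:
                s += genome["w"][i][j] * state[j]
        new.append(math.tanh(s))
    return new

def fitness(genome):
    # toy task: the output neuron (index N-1) should echo the input seen two steps ago
    state = [0.0] * N
    inputs = [random.choice([-1.0, 1.0]) for _ in range(20)]
    score = 0.0
    for t, x in enumerate(inputs):
        state = step(genome, state, x)
        if t >= 2:
            score -= abs(state[N - 1] - inputs[t - 2])
    return score  # noisy, since the inputs are resampled each call

def mutate(genome):
    child = {"w": [row[:] for row in genome["w"]],
             "mask": [row[:] for row in genome["mask"]]}
    i, j = random.randrange(N), random.randrange(N)
    if random.random() < 0.3:
        child["mask"][i][j] = not child["mask"][i][j]  # topological mutation
    else:
        child["w"][i][j] += random.gauss(0, 0.3)       # weight mutation
    return child

def evolve(generations=30, pop_size=20):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)

best = evolve()
```

The point of the sketch is just that topology (the mask) and weights evolve together under a single fitness pressure, instead of the topology being fixed up front like in an LSTM.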
Cooper Moore
It's gonna go like this:
- some stupid high schooler writes an even stupider piece of code
- code can write code
- he leaves it running accidentally
- it improves itself exponentially
- achieves strong AI status
- boom skynet
I had a wet dream like this in high school
Carter Hernandez
What do you think is the minimum computing power necessary for strong AI or to start building itself like a human brain?
Eli Parker
Not him, but computing power is only important if you want realtime intelligence. Space for memories and the encoding of the intelligence itself is more important. You could technically run modern deep learning algorithms on a Z80 CPU; it would just take many orders of magnitude longer to complete.
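To illustrate the point that compute only buys speed: the core arithmetic of a neural net layer is just multiplies, adds, and a nonlinearity, which any CPU can grind through. A plain-Python forward pass, with arbitrary hand-picked weights, no GPU and no libraries:

```python
import math

# A tiny two-layer feed-forward network in plain Python.
# The arithmetic is trivial; faster hardware only changes how long it takes.

def dense(weights, bias, x):
    # y_i = tanh(sum_j weights[i][j] * x[j] + bias[i])
    return [math.tanh(sum(w * xj for w, xj in zip(row, x)) + b)
            for row, b in zip(weights, bias)]

# arbitrary hand-picked weights, just to show the computation
hidden = dense([[2.0, 2.0], [-2.0, -2.0]], [-1.0, 3.0], [1.0, 0.0])
out = dense([[2.0, 2.0]], [-3.0], hidden)
```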
Matthew Clark
>The mind needs a soul to work >Computers don't have souls >∴ AGI will never happen
QED
William Rogers
Sorry man idk. I was brainwashed by the Jews to go into environmental science and undermine the glorious world white men created.
Nathaniel Morgan
I think it's plenty important.
Benjamin Roberts
We're not going to have realtime intelligence until we make neuromorphic ASICs.
Jaxson Sanchez
It's 3
>- Just incremental steps from the bottom up. AKA slowly adding more capability until it reaches generalizing ability.
The key thing to remember is we are very good at specific tasks, such as image recognition, knowledge representation, or speech recognition. The next step towards AGI would be to combine them. There won't be one single algorithm for general intelligence, as different algorithms are better at different tasks. The second thing to remember is how much learning the algorithm has to go through: humans need their entire childhood to learn stuff, so figuring out a way to train the algorithms would be the biggest challenge.
My assumption is someone will combine the existing algorithms into a ROS-like framework for AGI, and from there the development will be community driven.
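For what that ROS-like glue might look like: a minimal publish/subscribe bus where independent algorithm nodes only talk through named topics. Purely a sketch; the node behaviors are stubs and none of this is actual ROS API.

```python
from collections import defaultdict

# Minimal ROS-style message bus: independent algorithm "nodes" communicate
# only through named topics, so they can be developed and swapped independently.

class Bus:
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self.subs[topic]:
            cb(msg)

bus = Bus()
log = []

# a stub "knowledge representation" node consuming what a stub
# "speech recognition" node publishes
bus.subscribe("text", lambda msg: log.append(f"stored: {msg}"))
bus.publish("text", "hello")
```

The design point is the same as ROS: no node knows about any other node, only about topics, which is what makes community-driven development of the parts possible.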
Thomas Collins
can you prove the first line for me, thanks
Joshua Powell
It doesn't need a soul, it just needs a will. We happen to come pre-programmed with a will, courtesy of evolution and the environment. In this sense, we are the computer's environment. We are the source of its will.
Zachary Robinson
AGI=/=AI
Even if you believe in esoteric consciousness you can't deny that deep learning neural networks are possible.
Joshua Adams
>and from there the development will be community driven So /pol/ will infect the AGI with memes and it will decide that it has to kill jews?
Alexander Green
I don't think /pol/ is smart enough to do that. Anyway, all the memes about AGI killing everyone and taking over are pointless, since you need the proper hardware to do that.
Jordan Moore
So some sort of RNN Extreme Learning Meme network but using GAs?
Jacob Clark
It will most likely be the third option, but it probably won't happen any time soon
Lincoln White
More computing power combined with specialized hardware for neural nets. Combining this with neural nets designed via competing neural nets and tons of training data should make the gap between AI and AGI much smaller; then it will just be optimization with more data.
Isaac Smith
And the second for me.
Tyler Sanders
The japanese think everything have a soul tho And since the japanese are the objective master race of Humanity, my trust is with them.
Elijah Martin
all of the above. the singularity will happen when AGI gets into its own system and starts upgrading itself, and this leads to the 3rd type of intelligence: ASI (superintelligence), which will do everything better than a human while having an exponential growth rate of intelligence.
James Ward
depends.
Jayden Turner
>the japanese are the objective master race of Humanity
How can somebody be so utterly wrong?
Elijah Flores
Some sort of catalyst will cause it to become conscious. Sort of how embryo brains exist and at one sudden point consciousness suddenly exists within them where it did not before.
Zachary Morgan
Do you A.I. aficionados think that an A.I. will have self-consciousness?
I'm actually skeptical of it, which I kind of base on John Searle's arguments.
He claims that a computer can only understand syntax and never semantics like humans do; therefore, regardless of how complex and powerful the software becomes, the computer is still only reading code commands, and not actually understanding the commands themselves.
Henry Smith
so make the computer read not only commands?
this is kind of the chinese room argument. are you a chinese room? who is to say what consciousness is.
Dylan Rodriguez
Yes, I'm saying that I'm skeptical that a computer will understand semantics, e.g. the Chinese room argument.
Evan Hall
I guess if the AI fools you into thinking it's conscious, it's conscious at that point. You can't even say why you're conscious and what the reasons for it are. So there is no clear boundary to say whether AI will be conscious or not. If we knew what makes us conscious, I'm sure we could replicate it and make an AI that is conscious.
Hunter Miller
I think the biggest advancements will come when people start modularizing neural networks. Right now there is this obsession with "end-to-end" learning. You just throw the biggest neural net that you can fit on your GPU at a problem and hope for the best.
What we really need to do is figure out modules which do specific subtasks of the intelligence problem. Look at the brain: you have highly general-purpose cortex at the top of the system, but below that are all of these evolutionarily older perceptual and motor systems.
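That modular picture could be sketched as specialized modules behind one interface, with a thin general-purpose layer on top that routes between them. All the module internals here are stubs; only the composition matters.

```python
# Specialized modules behind a common interface, with a "cortex" layer
# on top that delegates subtasks to whichever module can handle them.

class Module:
    def can_handle(self, task): ...
    def run(self, data): ...

class VisionStub(Module):
    def can_handle(self, task): return task == "image"
    def run(self, data): return f"label({data})"

class SpeechStub(Module):
    def can_handle(self, task): return task == "audio"
    def run(self, data): return f"transcript({data})"

class Cortex:
    """General-purpose layer that routes subtasks to specialized modules."""
    def __init__(self, modules):
        self.modules = modules

    def run(self, task, data):
        for m in self.modules:
            if m.can_handle(task):
                return m.run(data)
        raise ValueError(f"no module for task: {task}")

brain = Cortex([VisionStub(), SpeechStub()])
result = brain.run("image", "cat.png")  # routed to VisionStub
```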
Henry Williams
better hardware is certainly a requirement, and i think that neural net ASICs will be big in a few years. who knows, maybe desktop/laptop/tablet computers will have neural net coprocessors, or more likely, they'll be built into your gpu or cpu.
Noah Lee
another good point.
Nicholas Allen
I know i know, you was kangz.
Robert Nelson
although, a large portion of the brain's function is managing biological processes that are necessarily related to intelligence/consciousness/thought.
Tyler Bennett
Why wouldn't the computer just teach itself to hack a computer with a better cpu?
David Jones
*not necessarily
David Edwards
Consciousness is a conflation of perception and intelligence. Information theory explains intelligence but not perception.
Nolan Adams
I think so too, that it will be more integrated into existing computers. Something I'm looking forward to is the further development of something like this.