Is AI really the number one existential threat to humanity?

to most scientists

no

The term "existential threat" means nothing.

It's leftism and gene editing. They will try to replace H. sapiens with their newly created H. sovieticus. This will bring about the final holy war and the apocalypse.

Not yet but it will be

no. anyone who actually understands anything about computers would understand why not.

>what is self evolving software
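(A toy sketch of what that phrase concretely means, nothing more: a loop that mutates candidate solutions and keeps the fitter ones. Not AGI, just the basic idea. All names and numbers here are made up for illustration.)

import random

# Toy "self-evolving" program: a genetic algorithm that evolves a bit string
# toward a target. Purely illustrative of the concept, not a claim about AGI.
TARGET = [1] * 20

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)   # keep the fittest candidates first
    if fitness(population[0]) == len(TARGET):
        break
    population = [mutate(random.choice(population[:10])) for _ in range(30)]

print(generation, max(fitness(g) for g in population))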

>entire field is about making computers do things they can't do now
>predict it will only ever make computers do things they can already do

This is not how change works.

What would be their purpose for destroying humanity? They're not dumb.

A meme

physical hardware limitations, dipshits

theorize all the shit you want. doesn't mean it will ever become a thing, autismos

Well done.
Accurate, humorous and relatable.
A*.

>muh hardware limitations

Let me guess, you aren't convinced that minds come from brains.

No, because ecological disaster is closer on the horizon than Skynet.

Yeah, you probably don't really know enough about computing to be arguing something like this...

number one threat to humanity is humanity itself

Probably true, barring some kind of meteor impact. But both AI and environmental disaster are products of humanity. So you're agreeing with the rest of the thread.

So spill the beans, egghead

>physical hardware limitations
What about them? We already have more than enough raw computing power. If you mean limits on transistor speed, those are pretty meaningless when you can just fill a whole building with more and more CPUs, and they get cheaper every year.

Hardware limitations? Transistors are a vastly superior substrate to our brains, at least from a performance point of view. A theoretical 'brain' on a transistor substrate could do hundreds of years of human research in a day.
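Rough back-of-the-envelope, using ballpark assumptions rather than measurements: cortical neurons fire at something like 100 Hz, transistors switch at something like 1 GHz, so the raw speed ratio is on the order of 10^7. Even after throwing most of that away on overhead, one wall-clock day still buys you centuries of subjective thinking time.

neuron_rate_hz = 1e2        # assumed average firing rate, order of magnitude only
transistor_rate_hz = 1e9    # assumed switching rate, order of magnitude only
speedup = transistor_rate_hz / neuron_rate_hz      # ~1e7
subjective_years_per_day = speedup / 365           # ~27,000 subjective years per wall-clock day
print(f"speedup ~{speedup:.0e}x, ~{subjective_years_per_day:,.0f} subjective years per day")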

It means a threat to the existence of our species, as in having a real chance of wiping us out.

>code monkey thinks he's an authority on AGI

every time

Wouldn't ecological disaster be more likely to just be a setback than a possible extinction event?

We're pretty hardy cunts after all. It's not easy to imagine an environmental disaster that would wipe out humans given that we survive in every climate the world has to offer.

How about evolvable hardware, then?

No, they're not. Power consumption is much lower in the brain.

No, it's A.I. + nuclear weapons + dumb politicians

no, it's global warming.

no, jews are.

You need a massive CNN running on a GPU cluster to do what a brain does using a fraction of that energy: recognize a bird as a bird.
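Ballpark numbers, all assumed rather than measured: training a big CNN means something like eight GPUs at ~300 W each running for a few days, while the entire brain runs on roughly 20 W, and bird recognition is only a tiny slice of what that 20 W buys.

gpu_power_w = 300.0       # assumed per-GPU draw under load
num_gpus = 8              # assumed size of a small training cluster
training_hours = 72       # assumed length of one training run
brain_power_w = 20.0      # commonly cited whole-brain power budget

training_kwh = gpu_power_w * num_gpus * training_hours / 1000    # ~173 kWh for one run
brain_kwh_same_period = brain_power_w * training_hours / 1000    # ~1.4 kWh over the same 72 h
print(training_kwh, brain_kwh_same_period)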

how will it kill all humans?

Oh ok, in that case yeah, AI is probably the only thing that can actually wipe all of us out.

Large-scale planetary events might do this too, but they're very predictable.
Even a huge meteor strike can't wipe us out, but a few decades of AI killer bots roaming the earth would do the trick.

This. AI problems are gonna be hella boring for a long time. But that doesn't drive clicks, so
>It may sound like science fiction, but... [describes the situation entirely in sci-fi references]

I would be very surprised if it was.

Reactionaries must get out.