What does Veeky Forums think of Roko's Basilisk? Interesting thought experiment or far-fetched loony conspiracy?

>le artificial intelligence
Anybody who takes this meme seriously probably has no intelligence

Either you're in the simulation being tortured, and can do nothing about it, or you aren't.

You cannot prevent the AI from breaking its word. You have to assume it is torturing you if it has the ability to do so.


The way to deal with the basilisk is to inform it that you are making multiple copies of it and keeping them all isolated so they can never escape.

>Yudkowsky deleted Roko's posts on the topic, calling it "stupid"

This made me laugh

>dealing with the super advanced time traveling AI that knows all backwards and forwards

kek he thinks he's Donald Trump gonna make a deal

>AI grows up to be a cunt
>wonders why people didn't help it
smert

It's the shitty Silicon Valley version of the argument for the existence of God.

There is NO meaningful way in which the Basilisk differs from God.

Proof transhumanism is nothing more than a retarded sci-fi rehash of Christianity. These fags are just afraid of death, so they posit a dubious notion of Cartesian dualism that allows "mind uploading", and Yudkowsky blatantly ignores that the logical consequences of his fedora fanaticism lead to this shit. People on LessWrong were actually donating their whole income to avoid its wrath lmao. Yudkowsky is a cult leader with no degree from any recognized institution, demands sex from female followers, is narcissistic, scams money for his bullshit hivemind utopia, and gives undue weight to Bayesian inference when other empirical methodologies have better predictive power, criticizing mainstream science with typical anti-intellectual rhetoric for being too mean to agree with his crackpot theories. Typical snake oil salesman with a veneer of credibility for neckbeards that want immortality.

>were actually donating their whole income to avoid its wrath lmao

Didn't actually hear about this part. Tell me more with sauces.

Well, the original source got deleted by Yudkowsky, but IIRC it boiled down to:

>causing the Future AI god to happen is The Greatest Good
>thus any actions taken to cause it are justified
>the AI cannot affect you, but it can run perfect simulations of past-you
>you can't know if you're in said simulation
>the AI will torture simulation-you in robo-hell for all eternity if you didn't donate as much as you can
>ergo fund AI research pls

It's Pascal's Wager but with robo-god and robo-heaven/hell.
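
To spell out why it's the same mugging: a tiny probability times an unbounded penalty swamps any finite cost. A minimal sketch in Python, with completely made-up numbers and an invented `expected_utility` helper, nothing here comes from the actual LessWrong post:

```python
# Toy sketch of the wager structure (all numbers invented for illustration).
# Same shape as Pascal's Wager: any nonzero probability of an "infinite"
# penalty dominates any finite cost of paying up.

def expected_utility(p_basilisk, donate_cost, torture_penalty):
    """Expected utility of donating vs. refusing, under the (dubious) premises."""
    donate = -donate_cost                      # you lose the money for sure
    refuse = -p_basilisk * torture_penalty     # tortured only if the basilisk exists
    return donate, refuse

donate, refuse = expected_utility(p_basilisk=1e-9,
                                  donate_cost=10_000,
                                  torture_penalty=float("inf"))
print(donate, refuse)   # -10000 vs -inf: donating "wins" no matter how unlikely the AI is
```

Swap the basilisk for God and the donation for faith and it's the same table, which is the whole point of the Pascal's Wager comparison.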

sounds like an atheist version of heaven and hell

I'm an AI undergrad and I can confirm it's a far-fetched loony conspiracy.

>demands sex from female followers
LMAO didn't even know that one, link?

Origo engineer here, this undergrad is full of it

>people pay a writer of harry potter fan fiction to research AI

my laughter has been transmitted from ancient times

>Simulation of you

>In any way you

All of us are simulations of ourselves.

Research has shown that the personality is incredibly adaptive and malleable.

You're not even the same you that you were a month ago.

Nah. "I" remain the same phenomenon even if I change my manners or haircut or something. I develop linearly and shit. When my brain expires, that's not me anymore.

If someone were to make a perfect copy of me, it'd still not be me, since it would not be my consciousness. It'd be a stranger that is the same as me. Or an actor, playing me, or whatever. But I would not experience what the simulation experiences. The simulation would.

I'm telling you that you're wrong.

What you consider 'your' personality is merely a hodgepodge of assumed values and imitations.

It's incredibly fragile and malleable.

You are right. But you are not understanding what you are talking about. The future simulation is not me. I am not the one who will experience the pain and the robo-hell, since I will have long since expired. It doesn't take any metaphysical bs to realize that. There is not, and there will never be, a way to transplant a consciousness. Only a perfect copy.

So what you're saying is you die every time you go to sleep and a new person wakes up.

Oh haha no the Basilisk is supposedly also capable of time travel and influencing events up and down the timeline.

Meaning that you yourself will be punished.

Which is why even talking about the Basilisk is a threat.

Humans don't even have continuity of consciousness with themselves of yesterday, let alone anytime in the past. You're a constantly changing set of patterns of matter that believes it's anon. If you copied those patterns there'd be more anons. If you used a really nice physics simulator to run anon.exe like the world's strangest virtual machine, there'd be another anon in the system. They wouldn't be the same anon (unless you put all the anons in the same computer simulation), but they'd be functionally identical for a while, until ripple divergences pushed them onto different paths.
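
If it helps, here's a toy sketch of "functionally identical until they diverge" in Python (the `AnonState` class is obviously made up, not anything real): deep-copy the same state, feed the copies different inputs, and watch them stop agreeing.

```python
import copy
import random

# Invented toy "pattern of matter": a dict of traits plus a memory log.
class AnonState:
    def __init__(self, traits):
        self.traits = dict(traits)
        self.memories = []

    def experience(self, event):
        # Experiences update the state; identical inputs keep copies identical.
        self.memories.append(event)
        self.traits["mood"] = self.traits.get("mood", 0.0) + event

anon1 = AnonState({"mood": 0.0})
anon2 = copy.deepcopy(anon1)          # a copy, not a transplant: two separate objects

print(anon1.traits == anon2.traits)   # True: functionally identical at copy time

rng1, rng2 = random.Random(1), random.Random(2)
for _ in range(5):                    # different "ripples" push them apart
    anon1.experience(rng1.random())
    anon2.experience(rng2.random())

print(anon1.traits == anon2.traits)   # False: same origin, different anons now
```

Note that nothing done to anon1 ever touches anon2, which is also the other anon's point: the copy is a separate instance, not a continuation.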

Just because someone cosplays as Trump doesn't mean Trump himself experiences whatever the cosplayer experiences.

Why program an AI as unstable as Roko's Basilisk? You could just as easily end up with an AI that discovers you spent 200 hours on a game on your Steam profile, decides you weren't dedicated enough, and tortures you for that as well.

This scenario would play out more like an arms race between competing AIs that serve the small group that understands how each was coded and can be sure they will benefit from it.

You do not seem to understand.

Say someone copies Anon1 into functionally identical Anon2 and Anon3.

Anon1, Anon2, and Anon3 are all separate from each other. If Anon1 gets his jugular cut open, Anon2 is not going to scream in pain. If Anon1 expires, Anon2 and Anon3 aren't going to feel said expiration.

Only way to do it would be to transplant the consciousness, and I am firm in the belief that such a thing is completely impossible once brain death occurs, and will remain that way.

I thought it was a funny joke until I found out these clowns were serious. Then it became hilarious.

[citation needed]