Roko's Basilisk

bad choice of wording*

You're missing the essential fact that the friendly space robot from the future is so powerful that it's able to improve human quality of life so much that torturing you and other indifferent people is ultimately a net gain from the utilitarian point of view. Otherwise, you're right.

Why would it be better to torture those who aren't a threat? And what kind of power would justify going out of your way to make everyone who doesn't believe in it suffer?

Actually, just from typing that out, I have a vague suspicion...

it's the most retarded shit I've ever heard

Don't even waste your time trying to understand it. Go try to understand something that is actually worthwhile.

>the friendly space robot from the future is so powerful that it's able to improve human quality of life so much that torturing you and other indifferent people is ultimately a net gain from the utilitarian point of view.
True. That REALLY doesn't diminish the religious parallels here, though.

Because if you don't move your ass and go work on the AI, it gets developed later, and every day of delay is a day's worth of lives it doesn't save. Ergo, fewer lives saved. Q.E.D.
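
For anyone who wants the back-of-the-envelope version of that ledger, here's a minimal sketch in Python. Every constant in it (lives saved per day, days of acceleration, number of tortured copies, disutility per copy) is a number I made up for illustration; none of it comes from Roko's original post.

# Back-of-the-envelope utilitarian ledger for the basilisk argument.
# Every number below is a made-up assumption, purely for illustration.

LIVES_SAVED_PER_DAY = 150_000   # assume the AI, once built, prevents roughly all daily deaths
DAYS_ACCELERATED = 365          # assume the torture threat makes the AI arrive one year sooner
TORTURED_COPIES = 1_000         # assume this many simulated non-contributors get tortured
DISUTILITY_PER_COPY = 50.0      # assume torturing one copy costs 50 life-equivalents of utility

gain = LIVES_SAVED_PER_DAY * DAYS_ACCELERATED   # utility gained by the earlier arrival
loss = TORTURED_COPIES * DISUTILITY_PER_COPY    # utility lost to the torture

print(f"gained by earlier arrival: {gain:,.0f} life-equivalents")
print(f"lost to the torture:       {loss:,.0f} life-equivalents")
print("torture 'justified'" if gain > loss else "torture not justified")

Notice you can get the "justified" line to print for basically any scenario just by picking the constants, which is the whole trick.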

So it's about preventing deaths that would happen anyway? That sounds like a self-righteous cause to me: making everyone else suffer because you managed to delay the deaths of a certain number of people. There would definitely have to be a human guiding this kind of AI for it to take a self-destructive route like that.

>So it's about preventing deaths that would happen anyway?
But if the AI has the power to simulate and torture the nonbelievers, it can surely simulate eternal bliss for both the AI researchers and all the people it saved. Why it can't just circumvent the whole issue and simulate the people who died before it came about, I have no idea; I'm not from lesswrong.

This is what I thought of when I saw this thread. In the future, memes are so bad they can kill.

Is it even a meme if it can't spread, since it kills the host immediately? Worse than the Ebola of memes, to be somewhat honest.