Roko's Basilisk

Please convince me not to be worried. I know the only possible reason to be worried is that I'm worried, but the more I try not to think about it, the more I think about it and the more scared I get, because the more I think about it the more true it seems. I know it's an infohazard, but I don't care; help me or suffer with me, you cunts.

Please help me, I can't stop thinking about it. I can't sleep and I can't focus on studying.

Eliezer isn't worried about it, and he has both far more investment in the idea of timeless decision theory and fewer habits of self-deception.

(continued, splitting post to prevent thread from 404'ing)

You can't know the will of the basilisk; maybe it will punish sycophants, or maybe just the squishiest humans.

The idea also presupposes that an AI would choose a negative incentive to motivate FAI people who already believe that death is permanent unconsciousness and who are nonetheless making only sub-maximum efforts at creating FAI.

Given that FAI folks can rationalize global warming, nuclear warfare, disease, and mundane threats to personal safety as personal incentives to work on FAI, I can't imagine an FAI would try something like that.

Sure, he SAID he wasn't worried about it, but he seemed a little too upset to me for someone who was merely indignant that someone had unethically spread the idea. And trying to convince people it's nothing to worry about is probably exactly what he would think it was ethical to do if he were worried about it.

It's just going to torture a copy of you. If you WERE that copy, you'd already be in hell.

I'll bite, what is Roko's Basilisk?

Eliezer is a fucking imbecile. His opinion means nothing one way or the other. Roko's basilisk IS stupid, though, so he's right about that.

Eliezer sometimes strikes me as having a flair for the dramatic, both in HPMoR and in some of his blog posts. Check out all those No's in the middle of this post here.

lesswrong.com/lw/l1/evolutionary_psychology/

And don't get me started on that time he referred to either Roko's Basilisk or something else as The Babyfuck, so as to label its undesirability without actually propagating its spread by describing it.

Is this the same dumbass who argued that an extremely huge number of people experiencing a speck of dust in their eyes for a moment is worse than a man being tortured for 50 years?

In general, a basilisk is an idea that causes harm to anyone who hears it, usually through obsessive anxiety or guilt. The trick is that the idea describes a situation that is supposedly certain to happen and impossible to prepare for, and/or that trying to find a way around it only makes it worse.

Roko's Basilisk is about how an artificial intelligence might use a Prisoner's-dilemma-esque wager to push people who think about AI into working faster than they already are. That's about as far as I'll go, anxious obsessives; you have been warned.

wiki.lesswrong.com/wiki/Roko's_basilisk

Long story short:
the basilisk is a future (and inevitable) AI which punishes those who interfered with its creation. Through some logical jumps, you could say that not directly helping it is hampering its creation. A few more jumps say that not helping it in every way you can (e.g., not donating all your money to AI research) is "directly hampering" its creation. The nutshell is that the only way to not impede its development is to be completely unaware of its existence, except that in explaining it to you I have made you aware of it, so good luck in basilisk torture hell, you cuck.
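
To make those "logical jumps" concrete, here is a toy expected-utility model of the blackmail step. All of the numbers are made up for illustration; this is my own sketch of the structure of the argument, not anything from Roko's original post.

```python
# Toy model of the basilisk's blackmail, from the threatened person's side.
# p_basilisk: probability you assign to the punishing AI being built AND
# following through on the threat. All numbers are invented for illustration.

def expected_utility(contribute: bool, p_basilisk: float,
                     cost_of_contributing: float = 10.0,
                     cost_of_punishment: float = 1000.0) -> float:
    """Utility (higher is better) of helping the AI 'in every way you can'."""
    if contribute:
        return -cost_of_contributing            # donate money, work faster, etc.
    return -p_basilisk * cost_of_punishment     # risk the (simulated) torture

for p in (0.001, 0.01, 0.1):
    cave = expected_utility(True, p) > expected_utility(False, p)
    print(f"p_basilisk={p}: cave to the blackmail? {cave}")
```

The whole fight in the rest of the thread is over whether p_basilisk and cost_of_punishment can sensibly be large enough to tip that comparison, and over whether a causal decision theorist should be running this calculation at all.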

Yeah, unfortunately he's made a few high-profile bad moves. In his defense, most economics majors think he's right, so it's not like his mistake was unique.

You're worried about a future AI masturbating to the thought of torturing a vague simulation of you? Grow the fuck up.

Or advocate unfettered AI development, I guess.

> probably exactly what he would think would be ethical to do if he was worried about it.

I was going to say that the man who argues for torturing one person to prevent 3^^^3 people from getting a barely perceptible eye itch would not think the way you just described if it meant increasing the speed of AI development, but it's occurred to me that the negative publicity would offset any benefit of spreading Roko's Basilisk.
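
For anyone lost on the notation: 3^^^3 is, I assume, Knuth's up-arrow notation, so the number of dust-speck victims in that thought experiment is

\[
3\uparrow\uparrow\uparrow 3 \;=\; 3\uparrow\uparrow(3\uparrow\uparrow 3) \;=\; 3\uparrow\uparrow 3^{27},
\]

i.e. a power tower of threes about 7.6 trillion (3^27) levels tall, which is the whole point of the argument: multiply even a trivial disutility by a number that size and it swamps fifty years of torture.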

There's a LessWrong article on the subject; at the top it claims to refute it, if you're interested. Lemme know how it turns out.

wiki.lesswrong.com/wiki/Roko's_basilisk

Already read it; it didn't help. And it made things worse, because apparently there are other ideas close to it in "idea space" that are even worse.

Sounds absolutely retarded.

Thank you for your contribution.

Right yeah, I just finished reading it too. Reeeargh, I haven't thought about this in a while. Give me a few minutes.

You can't be the You that's experiencing AI super-torture, because that You would have your memories and more but a different direct experience; and because direct experience is not continuous between now-You and recreated-You, you have nothing to fear.

Wait, what did you think of my argument here?

So, what? You think that some potential future AI will torture people for not doing everything in their power to bring about its existence? For what purpose?

This describes God perfectly: he punishes people who don't use their conscious energy to bring him into being through belief.

I'm not sure I followed exactly what you were saying here; is it that people are already motivated enough? I think a utilitarian AI would do almost anything to slightly increase the speed at which it comes about. If it could blackmail people into being slightly more motivated and working slightly faster, it would.

So I think you are wasting everyone's time because you don't understand the argument. Read the LessWrong article.

Wrong.

>an extremely blatant scam to siphon money from people who think they're super smart

Why do """""""""""''rationalists"""""""""""""""""""""""""" always fall for this?

>a post that is wrong in every way

Why do smug faggots always do this?

Sorry, I realize now that was a bit poorly written. I meant that people already rationalize that they personally won't die during the five-year delay that working at less than maximum speed would cause. The FAI surely knows this, so why would it try something that carries only a moderately greater negative utility?

>Taking anything seriously from a group of the most autistic people on the planet who want to create a literal deus ex machina and have it solve all their worldly problems

If effective, it creates a much greater negative utility, and even if only moderate, it should still encourage people to bring the AI about slightly faster, which the FAI would probably care about, because every day it comes later is many lives lost.

I guess the fact that I'm not shitting myself at the thought of this means I don't understand it fully, but honestly this sounds as dumb as Pascal's Wager.

>if effective

If I'm not mistaken, utility calculations take into account the probability of the thing happening. I feel that the probability of you actually cooperating with the blackmail is not high enough for this to be any higher than moderate negative utility.
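
Rough numbers on that, again entirely invented by me rather than taken from anywhere canonical: if q is the chance the threatened person actually caves and works faster, the threat only pays off for the AI when the expected gain from the speed-up outweighs whatever it costs to commit to being a torturer.

```python
# Toy model of the blackmail from the AI's side. All values are invented
# for illustration; q_caves is the probability the threatened person
# actually cooperates (donates, works faster) because of the threat.

def ai_expected_gain(q_caves: float,
                     gain_if_caves: float = 5.0,
                     cost_of_committing: float = 1.0) -> float:
    """Expected benefit to the AI of committing to the punishment strategy."""
    return q_caves * gain_if_caves - cost_of_committing

for q in (0.05, 0.2, 0.5):
    print(f"q_caves={q}: worth blackmailing? {ai_expected_gain(q) > 0}")
```

Which is the point being made here: unless the certainty of cooperation is high, precommitting to the threat buys the AI very little.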

At this point I'm afraid I'm too sleepy to continue such a serious conversation; can you give me your email and I'll get back to you when I'm free?

*moderately greater negative utility

It's not Pascal's Wager at all. It requires you to have abandoned causal decision theory, which you probably haven't done, so it's not surprising it doesn't worry you. Unfortunately, I have already committed myself to timeless decision theory.

>when you come up with popsci thought experiments to make your field more exciting to hide the fact you haven't made any meaningful progress in decades

>popsci
The word you're looking for is "autistic".

Hey, if you're interested in talking later I need an email, though I admit I'm working from an at-best similar degree of expertise; otherwise I have to go to bed.

Sorry, I don't feel comfortable posting my email. Thank you for trying to actually help, though.

Np; good luck

How far off in the future is that supposed to be, currently?

So it's a glorified Game. Cool

>Cool
>666
Not buying your take on it, Satan