A simplification of the basic game theory behind it/an analogy:
A owns a house but has to move away at time X. B is interested in the house.
A wants to sell the house to B but B could just wait until A is gone anyways and take the uninhabited house at no cost.
A threatens to burn down the house if B doesn't buy it. Buying the gas for it costs either nothing or a minimal amount of money (I don't know which version has what kind of effect on the outcome).
So what happens? I don't know. Maybe there is no "solution" for game theory problems of this kind. If B doesn't buy the house because he wants it for free, and A realizes he won't buy but A has to go away anyways, there is no reason for A to still spend the money to burn it down for no personal gain at all.
I don't know, but I think B would always "win" here. Retroactive threats don't work because, by the time the decision is made to act on them or not, the reason for the threat to exist in the first place is already gone; the matter has been settled.
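If you want the backward-induction argument made concrete, here's a minimal sketch of that house game in Python. The payoff numbers (house value, asking price, gas cost) are made up for illustration; only the ordering of the payoffs matters.

```python
# Minimal sketch of the house game as a sequential game, solved by backward
# induction. All payoff numbers are assumptions chosen for illustration.

HOUSE_VALUE = 10   # value of the house to B
PRICE = 6          # price A asks
GAS_COST = 1       # A's cost to burn the house down (try 0 as well)

def a_best_response():
    """A moves last (only reached if B refused to buy):
    compare A's payoff for burning vs. just walking away."""
    burn_payoff = -GAS_COST   # A gains nothing and pays for the gas
    walk_payoff = 0           # A simply leaves
    return ("burn", burn_payoff) if burn_payoff > walk_payoff else ("walk", walk_payoff)

def solve():
    """B moves first, anticipating A's best response."""
    a_action, _ = a_best_response()
    b_buy_payoff = HOUSE_VALUE - PRICE                          # B pays and gets the house
    b_refuse_payoff = HOUSE_VALUE if a_action == "walk" else 0  # free house or ashes
    b_action = "buy" if b_buy_payoff > b_refuse_payoff else "refuse"
    return b_action, a_action

if __name__ == "__main__":
    b_action, a_action = solve()
    print(f"B's move: {b_action}; A's move if refused: {a_action}")
    # With any GAS_COST > 0, burning is strictly worse for A than walking away,
    # so the threat is never carried out -- B refuses and takes the house for free.
```

Same conclusion as above: once A's threat isn't credible at the point where it would have to be carried out, B just waits it out.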
Roko's Basilisk
Not meaningless, just not especially helpful.
A rock is a rock.
Which one of you posted this on /b/?
This is a great book.
>he took roko's basilisk seriously enough to wonder why people take it seriously
>lesswrong better than rationalwiki
Consider the following:
You go to sleep, and an alien master-race takes a scan of your brain and immediately starts running the scan forward in time in a simulated reality on their super advanced alien hardware. At the same moment, you wake up from your slumber and go about your day.
Both you and the scan have perceived continuity with your consciousness right before the scan initiated. Both of you "feel" like you're the same person, even if one of you still has the original meat body and one of you lives in a computer now.
The problem with Roko's Basilisk is that anyone who fears it is indicating that they give even the slightest shit about what happens to their copy. They are afraid that someone might rip back the curtains on their current reality, reveal that they're actually in a simulation, and start torturing them, because they know that's the perceived experience their copy would have in the future and they know how they'd feel in that situation. So, they behave as though that could happen at any time so as to prevent it from happening to their future self. And that's fucking stupid.
I'd argue that the flashy multicolored strobe lights that give epileptic people seizures are closer to the Langford Basilisk than the McCollough effect. The key thing with a Langford Basilisk is that the very act of looking at it screws with your mind. The McCollough effect is trippy, but it takes time to work and is not hazardous to your health.
I suppose it's a different degree of memetic hazard. The McCollough effect is more of a "memetic hack" that is able to get into our brain and slightly alter the way it works.
The fear is that at any moment you may actually turn into the copy, since we can't prove the reality of any moment other than the present.