army1987 comments on I've had it with those dark rumours about our culture rigorously suppressing opinions - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (857)
I didn't find the idea that scary or dangerous, at least not any more than Pascal's wager. But I also have this creepy meta-feeling that I really, desperately want to believe that, so that I'm risking less than I would be if I did find it dangerous/plausible/scary.
I found it isomorphic to Pascal's wager, at least assuming that people who fail to be Christian solely because they've never heard of (or seriously thought about) Christianity won't go to hell.
I've thought about the idea enough to realize that (assuming one takes it seriously at all) the above is not guaranteed.
Well, according to Dante, people who failed to be Christian because they lived before Jesus ended up in Limbo. I'm not sure whether that's based on any actual theology.
IIRC, the Church's current stance is the reverse of that: atheism is a sin if you've heard of the idea of God but refuse to think seriously about it, but not if, despite thinking it through, you still can't bring yourself to believe.
Can you source that?
I think I read that in Youcat where it talks about the first commandment, but neither the Google Books nor the Amazon previews contain that part of the book.
I was actually referring to the basilisk.
You mean that gung onq guvat zvtug unccra rira gb gubfr jub unira'g urneq be gubhtug nobhg gung fpranevb?
Yes.
That doesn't sound plausible to me, but if you're right, the right thing to do would be to let as many people as possible know about the issue, so that it's more likely to be averted.
The way it works is: if people keep the basilisk a secret for the sake of protecting others (even if doing so increases their own punishment), that shows they value protecting others over their own safety. Therefore, a more effective way to punish them is to torture the very people they're trying to protect.
Are you sure you don't want to at the very least rot-13 that? Some people here have explicitly said they'd rather not find out what the basilisk is.
In Newcomb's problem, a good agent will one-box in the emulator and two-box in reality, provided it can tell the simulation and reality apart. Even the tiniest flaw in the emulation removes any incentive to follow through on the basilisk's threat. You would need a very dumb decision theory for the agent to just torture people for no gain.
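The point about distinguishing simulation from reality can be made concrete with a toy model. This is a hypothetical sketch, not anything from the thread: the payoffs are the standard Newcomb amounts, and the `agent`/`run_newcomb` names and the `in_simulation` flag are illustrative assumptions standing in for "the predictor runs a copy of you, and the copy can detect the flaw in the emulation."

```python
# Toy Newcomb setup (illustrative assumptions throughout):
# box A is transparent and always holds SMALL; box B is opaque and is
# filled with BIG only if the predictor's simulation of the agent one-boxes.

SMALL = 1_000        # transparent box A
BIG = 1_000_000      # opaque box B, filled iff the simulated agent one-boxes

def agent(in_simulation: bool) -> str:
    """An agent that can tell the (flawed) emulation apart from reality."""
    if in_simulation:
        return "one-box"   # behave nicely where the predictor is watching
    return "two-box"       # in reality, take everything

def run_newcomb(agent) -> int:
    # The predictor decides whether to fill box B by simulating the agent.
    predicted = agent(in_simulation=True)
    box_b = BIG if predicted == "one-box" else 0
    # Then the real agent makes its actual choice.
    choice = agent(in_simulation=False)
    return box_b + SMALL if choice == "two-box" else box_b

print(run_newcomb(agent))  # 1001000
```

An agent that cannot detect the simulation must behave identically in both calls, so it can get at most BIG (by one-boxing everywhere); the detectable-flaw agent walks away with BIG + SMALL, which is why any imperfection in the emulation breaks the predictor's (or the basilisk's) leverage.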
I hope the downvotes on the parent are for the taboo violation and not for its content. When it comes to Roko's Basilisk specifically (considering potential spooky acausal variants separately), Army's solution is correct. With that caveat firmly in place, I don't believe even Eliezer would disagree. If he did, I would have to seriously reconsider my support for SIAI; it would indicate that he is someone likely to actually implement (or support the implementation of) the Basilisk's glare.
That is certainly not consistent with his behavior.
I indeed suspect that someone is just downvoting all posts mentioning the basilisk regardless of content. (As for “[T]hat doesn't sound plausible to me”, this is slightly less true now than when I wrote that post -- see http://lesswrong.com/lw/2ft/open_thread_july_2010_part_2/64f2.)
Consider using the term "Roko's Basilisk" for clarity.
Do you mean “not guaranteed that, given that hell exists, people who have never heard of it won't go there”, or “not guaranteed that, given that hell exists and that people who have never heard of it won't go there, it is equivalent to [the thing that should not be mentioned]”?