Unnamed comments on How To Lose 100 Karma In 6 Hours -- What Just Happened - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (214)
All four examples involve threats - one party threatening to punish another unless the other party obeys some rule - but the last threat (threatening to increase existential risk contingent on acts of forum moderation) sticks out as different from the others in several ways:
I'm not entirely in agreement with the first three threats, but they're at least within the bounds of the kinds of threats that are commonly acceptable. The fourth is not.
And 5. Ridiculousness. "He threatened what? ... And they took it seriously?"
(Posted as an example of a way this is notably different from the typical example. Note that this is also my reaction, but I might well be wrong.)
My bet would be that he believes that it is proportional. From where I'm standing, this looks like assigning too much impact to LW and to censorship of posts. Note that 2 and 4 are particularly good arguments for why something of this nature was dumb regardless of importance.
Re #1: EY claimed his censorship caused something like 0.0001% risk reduction at the time, hence the amount chosen -- it is there to balance his motivation out.
Re #2: Letting Christians/Republicans know that they should be interested in passing a law is not the same as hostage taking or harming someone's family. I agree that narrow targeting is preferable.
Re #3 and #4: I have a right to tell Christians/Republicans about a law they're likely to feel should be passed -- it's a right granted to me by the country I live in. I can tell them about that law for whatever reason I want. That's also a right granted to me by the country I live in. By definition this is legitimate authority, because a legitimate authority granted me these rights.
Citation? That sounds like an insane thing for Eliezer to have said.
After reviewing my copies of the deleted post, I can say that he doesn't say this explicitly. I was remembering another commenter who was trying to work out the implications on x-risk of having viewed the basilisk.
EY does say things that directly imply he thinks the post is a basilisk because of an x-risk increase, but he does not say what he thinks that increase is.
Edit: can't reply, no karma. It means I don't know if it's proportional.
Nod. That makes more sense.
One thing that Eliezer takes care to avoid doing is giving his actual numbers regarding the existential possibilities. And that is an extremely wise decision. Not everyone has fully internalised the idea behind Shut Up and Do The Impossible! Even if Eliezer believed that all of the work he and the SIAI may do would only improve our existential expectation by the kind of tiny amount you mention, it would most likely still be the right choice to go ahead and do exactly what he is trying to do. But not everyone is that good at multiplication.
Does that mean you're backing away from your assertion of proportionality?
Or just that you're using a different argument to support it?