Artaxerxes comments on xkcd on the AI box experiment - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I am no PR specialist, but I think the relevant folks should agree on a simple, sensible message accessible to non-experts, and then just hammer that same message relentlessly. So why mention "Newcomb-like problems", for example? Maybe 10 people in the world know what you really mean by that. For instance:
(a) The original thing was an overreaction,
(b) It is a sensible social norm to remove triggering stimuli, and Roko's basilisk was an anxiety trigger for some people,
(c) In fact, there is an entire area of decision theory behind the thought experiment, involving counterfactual copies, blackmail, etc., just as there is quantum mechanics behind Schrödinger's cat. Once you are done sniggering about those weirdos with their half-alive, half-dead cat, you might want to look into the serious work done there.
What you want to fight with the message is the perception that you are a weirdo cult/religion. I am very sympathetic to what is happening here, but this is, to use the local language, "a Slytherin problem," not "a Ravenclaw problem."
I expect that in 10 years, if/when MIRI gets a ton of real published work under its belt, this is going to go away, or at least morph into "eccentric academics being eccentric."
p.s. This should be obvious: don't lie on the internet.
Well, I think your suggestion is very good and barely needs any modification before being put into practice.
Comparing what you've suggested to Eliezer's response in the comments on xkcd's reddit post for the comic, I think he would do well to adopt something along the lines of what you've advised. I'm really not sure all the finger-pointing he's done helps, nor does the serious-business tone.
This all seems like a missed opportunity for Eliezer and MIRI. XKCD talks about the dangers of superintelligence to its massive audience, and instead of using that new attention to get the word out about your organisation's important work, the whole thing gets mired in internet drama about the basilisk for the trillionth time, and a huge part of many people's limited exposure to LW and MIRI ends up negative or silly.