I do consider the effect of shifting public perception on an existential risk issue by a tiny bit to be worth lives.
So you are ready to kill people in order to shift the public perception of an existential risk issue by a tiny bit?
I never claimed to be a complete utilitarian. For that matter, I wouldn't push fat men off bridges.
As far as the Wikipedia policy goes, it's a policy that just doesn't matter much in the grand scheme of things. For what it's worth, I never touched the German Quantified Self article, which for a long time contained a paragraph with my name.
I do, however, have personal reasons for opposing the Wikipedia policy: Wikipedia gets the cause of death of my father wrong, and I can't easily correct the issue because Wikipedia cites a news article with wrong information as its source.
Sh...
There's a big Hollywood movie coming out with an apocalyptic Singularity-like story, called Transcendence. (IMDB, Wiki, official site) With an A-list cast and big budget, I contend this movie is the front-runner to be 2014's most significant influence on discussions of superintelligence outside specialist circles. Anyone hoping to influence those discussions should start preparing some talking points.
I don't see anybody here agreeing with me on this. The movie was briefly discussed on LW when it was first announced in March 2013, but since then, only the trailer (out since December) has been mentioned. MIRI hasn't published a word about it. This amazes me. We have three months until millions of people who have never considered superintelligence start thinking about it. Is nobody bothering to craft a response to the movie yet? Shouldn't there be something that lazy journalists, assigned to write about this movie, can find?
Because if there isn't, they'll dismiss the danger of AI, as Erik Sofge already did in an early piece about the movie for Popular Science, and nudge their readers to do so too. And that'd be a shame, wouldn't it?