
Lumifer comments on "A big Singularity-themed Hollywood movie out in April offers many opportunities to talk about AI risk"

Post author: chaosmage 07 January 2014 05:48PM 34 points



You are viewing a single comment's thread.

Comment author: Lumifer 08 January 2014 06:02:23PM 3 points

There are important issues where people die

Like attempting to do a PR campaign for a non-profit via Wikipedia by piggybacking onto a Hollywood big-budget movie...?

Comment author: ChristianKl 09 January 2014 12:29:49AM 2 points

Like attempting to do a PR campaign for a non-profit via Wikipedia by piggybacking onto a Hollywood big-budget movie...?

I do consider the effect of shifting public perception of an existential risk issue by a tiny bit to be worth lives. UFAI is on the road to killing people. I think you are failing to multiply if you believe that isn't worth lives.

Comment author: Lumifer 09 January 2014 12:52:26AM 2 points

I do consider the effect of shifting public perception of an existential risk issue by a tiny bit to be worth lives.

So you are ready to kill people in order to shift the public perception of an existential risk issue by a tiny bit?

Comment author: ChristianKl 09 January 2014 12:13:04PM 1 point

I never claimed to be a complete utilitarian. For that matter, I wouldn't push fat men off bridges.

As far as the Wikipedia policy goes, it's a policy that just doesn't matter much in the grand scheme of things. For what it's worth, I never touched the German Quantified Self article, which for a long time contained a paragraph with my name.

I do, however, have personal reasons for opposing the Wikipedia policy: Wikipedia gets my father's cause of death wrong, and I can't easily correct the error because Wikipedia cites a news article containing the wrong information as its source.

Should a good opportunity arise, I will place the information somewhere citable and correct the error, and I won't feel bad about it.

The Wikipedia policy is designed in a way that encourages interested parties to edit anonymously. I think Wikipedia deserves the edits from interested parties that it gets until it adopts a more reasonable policy, one that allows interested parties to correct factual errors without first planting the information somewhere and then editing against policy.

Comment author: Lumifer 09 January 2014 03:41:53PM 1 point

I am not talking about Wikipedia's policies.

You said "worth lives" -- what did you mean by that?

Comment author: gjm 09 January 2014 12:47:52PM 0 points

It looks as if you're assuming that the overall PR effect of having MIRI or MIRI supporters add links from the Wikipedia article about Transcendence to comments from MIRI would be positive, or at least that it's more likely to be positive than negative.

I don't think that is a safe assumption.

As David says, one quite likely outcome is that a bunch of people start to see MIRI as spammers, and its overall influence ends up less rather than more.

Comment author: ChristianKl 09 January 2014 01:24:10PM 0 points

It looks as if you're assuming that the overall PR effect of having MIRI or MIRI supporters add links from the Wikipedia article about Transcendence to comments from MIRI would be positive, or at least that it's more likely to be positive than negative.

I agree that this is a question that deserves serious thought. But the issue of violating the Wikipedia policy doesn't factor much into the calculation.

As David says, one quite likely outcome is that a bunch of people start to see MIRI as spammers, and its overall influence ends up less rather than more.

It's quite natural behavior to add relevant quotations to a Wikipedia article. I wouldn't do it with an account that has no prior benign history, or through anonymous edits.

If you are a good citizen of the web, you probably fix Wikipedia errors when you notice them, so you should have an account that doesn't look spammy. If you don't, you probably leave the task to someone else who has a better grasp of Wikipedia.

Comment author: David_Gerard 09 January 2014 03:22:13PM 2 points

It's quite natural behavior to add relevant quotations to a Wikipedia article. I wouldn't do it with an account that has no prior benign history, or through anonymous edits.

Good thing you're not discussing it in a public forum, then, where screencaps are possible.

Comment author: gjm 09 January 2014 06:05:45PM 0 points

But the issue of violating the Wikipedia policy doesn't factor much into the calculation.

The fact that the proposed edits violate Wikipedia policy is an essential part of why doing as you propose would be likely to have a negative impact on MIRI's reputation.

(For the avoidance of doubt, I don't think this is the only reason not to do it. If you use something that has policies, you should generally follow those policies unless they're very unreasonable. But since ChristianKl is arguing that an expected-utility calculation (by tweaking the probability of a good/bad singularity) produces results that swamp that, I think it's important to note that expected-utility maximization doesn't by any means obviously produce the conclusions he's arguing for.)