
ChristianKl comments on A big Singularity-themed Hollywood movie out in April offers many opportunities to talk about AI risk - Less Wrong Discussion

34 Post author: chaosmage 07 January 2014 05:48PM




Comment author: ChristianKl 08 January 2014 01:15:49PM *  -1 points [-]

I started by thinking that one should wait till the film comes out. Now I think that's a bad idea. By the time the film comes out, relevant journalists will read the articles that have already been published. If those contain quotations from a MIRI person, then that person is going to get contacted for further interviews.

It might also be worth thinking about whether one can place a quote from a MIRI person in the Wikipedia article for the film.

Comment author: Viliam_Bur 10 January 2014 09:22:48AM 4 points [-]

It might also be worth thinking about whether one can place a quote from a MIRI person in the Wikipedia article for the film.

Inserting your quote into Wikipedia, whether directly or through a sockpuppet -- wrong.

Making a quote famous enough (outside of Wikipedia) so that other people can't resist putting it into Wikipedia -- right.

So, the right approach (compatible with Wikipedia's rules) would be e.g. to give an interview to a newspaper shortly after the film comes out. As a side benefit, people who read the newspaper would also get the info. Another possible method is to discuss the movie in a published paper.

In other words, MIRI should publish some texts when the movie comes out, but should not publish them on Wikipedia directly. The texts will get to Wikipedia if they become visible enough outside of Wikipedia. And becoming visible outside of Wikipedia is also important.

Comment author: ChristianKl 10 January 2014 12:16:19PM *  1 point [-]

So, the right approach (compatible with Wikipedia's rules) would be e.g. to give an interview to a newspaper shortly after the film comes out. As a side benefit, people who read the newspaper would also get the info. Another possible method is to discuss the movie in a published paper.

I agree. But it might also help to point a trustworthy Wikipedian you know to the quote, so they can insert it and make the case that it belongs in the article on its merits.

Almost by definition, someone who understands enough about UFAI to be qualified to decide which quotes belong in a relevant article has an interest in getting the topic represented the right way.

The idea that the interests that come with being qualified on a topic should disqualify one from editing a relevant article is just wrong. Followed consistently, it leads to articles that contain factual errors, written by people who don't know what they are talking about and just copy what some other source wrote; newspapers tend to be error-ridden and are published without peer review.

The texts will get to Wikipedia if they become visible enough outside of Wikipedia.

There are plenty of texts out there that are very visible outside of Wikipedia and that deserve to be cited in Wikipedia, but aren't. The Wikipedia system doesn't work in a way that effectively identifies all suitable texts. I don't think there is anything wrong with helping it along in an area where you know the landscape.

As far as the topic goes, I have no financial ties to MIRI, and neither do a lot of people reading here. The only interest I have that could disqualify me is that I don't want humanity to die. The idea that this is the same thing as commercial spam, and an illegitimate motivation for making an edit to Wikipedia, is to me wrong on a fundamental level.

I'm surprised that, given the amount of utilitarianism on LessWrong, that sentiment doesn't get a better reception here. Then again, I guess it's easy to argue in the abstract that one should push the fat man but hard to make ethical decisions in real life.

Comment author: Viliam_Bur 10 January 2014 01:55:38PM *  2 points [-]

The idea is that if you can't get your quote published outside of Wikipedia, you shouldn't put it in Wikipedia. Preferably, "outside" shouldn't be your own blog, but something more respectable.

The conflict of interest part is (this is my opinion, not Wikipedia policy) just a simple heuristic to prevent most of the "it's not anywhere else, but I insist it should be in Wikipedia" edits.

Yes, it is extremely annoying when a newspaper prints false information, some Wikipedia editor insists on adding it to the article, and calls all expert explanations "original research". But the heuristic is still useful in general -- it would be much worse on average to instead have hundreds of crackpots claiming expertise and "correcting" information from newspapers and books. It is easier to find a third-party volunteer to verify whether X really was published in a newspaper than to verify whether X is true. -- And you need a lot of volunteers who have no life and only care about the Wikipedia rules, because if you remove all such non-experts from the game and leave only the experts and the crackpots, the crackpots will obviously win by their numbers and dedication.

If a newspaper contains an error, the long-term strategy to fix it is to find or create another newspaper article or a book that will correct the error. In general, the battles should be fought outside of Wikipedia, and Wikipedia should just announce the winners.

The only interest I have that could disqualify me is that I don't want humanity to die.

Problem is, there are many people who believe the same. Don't think only of spam, but also of religions, cults, homeopathy, etc. Even if you are right and they are wrong, the difference is only visible to those who agree that you are right. For everyone else, this is just one of literally thousands of different causes that want to be promoted via Wikipedia. The Wikipedia immune system will evaluate you as a threat.

The Wikipedia system doesn't work in a way that effectively identifies all suitable texts.

Well, this is the place where you have the chance to add the text successfully. The more important and relevant the source of the text, the higher your chance of success.

I'm surprised that, given the amount of utilitarianism on LessWrong, that sentiment doesn't get a better reception here.

Okay, let's talk about consequences. You add a MIRI quote to Wikipedia, someone deletes it. You add it again, someone deletes it again and quotes some Wikipedia rule. You add it again and perhaps even say that you consider Wikipedia rules irrelevant in this specific case. The quote is removed again, and now you have a group of people with no life watching the page all day, ready to remove your quote if you add it again. Also, you have increased the probability of other MIRI quotes being removed in the future, even if they are moderately well sourced.

Strategy B: Get the quote in some high-status source. Then add the quote to the article, somewhere at the bottom (obviously it's not a part of the plot, nor the characters and cast, but maybe criticism), with a reference to the source. The probability that it stays there is much higher.

I guess it's easy to argue in the abstract that one should push the fat man

But in real life the fat man has a dozen deontologist bodyguards ready to stop you. So instead you listen to the bodyguards; they tell you they only obey the wisdom from newspapers, so you bring them the newspaper article recommending pushing the man, and they will happily push him themselves.

Comment author: NancyLebovitz 10 January 2014 06:00:34PM 3 points [-]

Maybe instead of focusing on details of quoting in Wikipedia, we should be looking at how to write things which are sufficiently sharp and interesting that they keep getting quoted.

Comment author: ChristianKl 10 January 2014 02:43:29PM 0 points [-]

The idea is that if you can't get your quote published outside of Wikipedia, you shouldn't put it in Wikipedia. Preferably, "outside" shouldn't be your own blog, but something more respectable.

Yes, having it published outside would be the start. I live in a world where it's easy for me to get things about Quantified Self published in relevant sources. I just have trouble placing something about my father there to get his Wikipedia article factually correct. While he lived, he wanted our press stories without entanglement.

It might be that I underrate the difficulty MIRI has with getting something published 'outside'. I would expect that it should be easy to find a Wired journalist who is happy to write such a story.

But even if you cannot find an actual journalist, write the article yourself. Most newspapers do publish meaningful op-eds. I would be surprised if you couldn't find a newspaper willing to publish it. It's free content for them, and MIRI is somewhat authoritative, so there's no reason not to publish the article provided it's well written.

Getting something published in the Guardian's Comment is free section is also really easy, and might be enough that most Wikipedia editors consider it "outside".

Okay, let's talk about consequences. You add a MIRI quote to Wikipedia, someone deletes it. You add it again, someone deletes it again and quotes some Wikipedia rule. You add it again and perhaps even say that you consider Wikipedia rules irrelevant in this specific case.

I probably wouldn't engage in an edit war. Nowhere did I argue that you should be stupid about adding the quote.

Yes, I do agree that you would want to get the quote into some newspaper before you edit the Wikipedia article. Given that the article is about a topic with a lot of interest, that's what you need to do to make the edit stick.

Comment author: Viliam_Bur 10 January 2014 06:38:35PM 1 point [-]

I live in a world where it's easy for me to get things about Quantified Self published in relevant sources.

How much is because of the relevance of QS to the sources, and how much is your skill? I mean, if your skill plays an important role, perhaps you could volunteer for MIRI or CFAR as a media person. For example, they would give you the materials they produced, and you would try to get them in media (not Wikipedia, for the beginning).

It might be that I underrate the difficult that MIRI has with getting something published 'outside'. I would expect that it should be easy to find a Wired journalist who is happy to write such a story.

I don't know such details about MIRI; I am on the other side of the planet. You would have to ask them whether they are satisfied with their media output. Maybe they are, maybe they are not. Maybe they consider it a better use of their time to focus on something else (AI research), but would appreciate it if someone else pushed their material to the media. This is just my guess, but I think it's worth asking. (Specifically: ask lukeprog.)

But even if you can not find an actual journalist, write the article yourself.

This is another way you could be helpful. Again, ask them. But I think that having a volunteer who pushes your material to media, and is good at doing it, is a great help for any organization.

Comment author: ChristianKl 10 January 2014 08:08:50PM 0 points [-]

How much is because of the relevance of QS to the sources, and how much is your skill? I mean, if your skill plays an important role, perhaps you could volunteer for MIRI or CFAR as a media person.

It's difficult to judge your own skill. I was in the right place at the right time, and therefore the first person to be featured in the German news media. I spoke in a way that was interesting, and from there other journalists continued to contact me.

MIRI's PR goals are also very different from those of QS. QS basically wins if it can motivate individuals to do QS and come to QS meetups; it's not necessary to convince the existing medical system that QS is good. MIRI, on the other hand, wins to the extent that it can convince AI researchers to change their ways and gets funders who donate money to increase its output.

MIRI PR has to take care to avoid antagonizing existing AI researchers. When I do QS PR, I don't want to associate with Big Pharma anyway, and I can say things that might antagonize people.

I could imagine contributing something to CFAR PR if CFAR operated in Germany, but currently that's not the case. CFAR probably benefits from telling the story that it's the hot new thing that's much better than the awful status quo.

Should CFAR organise an event in Berlin, I could try to get a journalist to cover it.

But I think that having a volunteer who pushes your material to media, and is good at doing it, is a great help for any organization.

It's not really a matter of pushing, but of shaping the material into a form the media wants. Authenticity matters a great deal; if a journalist got the feeling that I'm just pushing someone else's statements, the kind of work I did wouldn't work as well.

The mindset is much more that you have something they want, and they have something you want.

A while ago I heard Jeff Hawkins, who started Palm, say that the best way to get VC funding is to play hard to get. The same thing might be true with regard to the media.

In this case I think the film provides a good opportunity for setting up such a relationship for MIRI. Start by being visible from the beginning as someone authoritative who has something interesting to say about the film.

Afterwards, I would expect journalists to reach out to MIRI, and MIRI can provide them with material they want. That's different from MIRI trying to push something on journalists.

Comment author: Viliam_Bur 10 January 2014 10:15:07PM *  0 points [-]

Should CFAR organise an event in Berlin, I could try to get a journalist to cover it.

They are planning to do a workshop or a few of them in Europe. I don't know more details, though.

Covering the event would be useful for next workshops and for the local LW meetups.

Comment author: David_Gerard 13 January 2014 10:31:16AM *  0 points [-]

CiF might actually be a good place to get an op-ed placed. Note that they happily put a stupid headline on (its byline might as well be "Trolling is Free, Clicks are Sacred") and hack up the text, all while putting your picture on, not paying you and bringing on the faeces-flinging monkeys in the comments (which one should never, ever read). But it might be of interest to them.

Comment author: David_Gerard 08 January 2014 04:54:32PM 6 points [-]
Comment author: ChristianKl 08 January 2014 05:15:46PM -1 points [-]

I'm not talking about paid advocacy.

Comment author: David_Gerard 08 January 2014 05:27:50PM 6 points [-]

Spam is spam. If you want MIRI to be primarily known to Wikipedia as spammers ...

Comment author: chaosmage 08 January 2014 02:13:14PM *  0 points [-]

A Wikipedia page for an A-list science fiction movie can get 10,000 views per day before it is released; it will peak immediately after release and then slowly taper off (Example) until it flatlines at around 1,000/day (Example). For comparison, the MIRI page gets about 50/day and Technological singularity gets about 2,000/day.

So yeah, that'd be an excellent place to link to lukeprog's comment from.

I would expect the Wikipedia page to be tightly monitored by the film's marketers, so any critical comment would have to fully meet Wikipedia's relevance criteria in order to survive a series of edits, and a bunch of us would need to keep putting it back in if it gets removed.

Comment author: V_V 08 January 2014 04:02:23PM 6 points [-]

Please don't use Wikipedia for advertisement/propaganda.

Comment author: ChristianKl 08 January 2014 04:36:15PM 1 point [-]

There's a fine line between propaganda and adding meaningful content that points readers of the article to the right resources.

Comment author: David_Gerard 08 January 2014 04:53:05PM 6 points [-]

Wikipedia:Conflict of interest

Please don't do this.

Comment author: ChristianKl 08 January 2014 05:18:39PM *  -2 points [-]

Could you make the case on the basis of utilitarian morals?

By the way, I substantially disagree with the Wikipedia policy as it stands. It prevents me from removing mistakes in cases where I have better information than some news reporter who wrote something that's simply wrong. I think Citizendium's policy on the matter was better.

Comment author: David_Gerard 08 January 2014 05:28:38PM 5 points [-]

Could you make the case on the basis of utilitarian morals?

All spammers can justify spamming to themselves.

I think citizendium policy on the matter was better.

Funnily enough, one of these works and one is dead.

Comment author: ChristianKl 08 January 2014 05:49:00PM 0 points [-]

Funnily enough, one of these works and one is dead.

If you're claiming that Wikipedia works in the sense that it effectively prevents interested parties from editing articles, I think you are wrong.

I think Wikipedia invites interested parties to edit it covertly by providing no open means for them to get errors corrected.

Comment author: [deleted] 08 January 2014 06:32:32PM 3 points [-]

If you make a claim that Wikipedia works in the sense that it's effectively prevents interested parties from editing articles I think you are wrong.

I think he means that Wikipedia unlike Citizendium has managed to create a usable encyclopaedia.

Comment author: ChristianKl 10 January 2014 12:24:45PM 0 points [-]

I think he means that Wikipedia unlike Citizendium has managed to create a usable encyclopaedia.

By making it easy for people to spam it. There are various reasons why Citizendium failed; I'm not claiming it was perfect overall.

Comment author: ChristianKl 08 January 2014 05:49:44PM *  -2 points [-]

All spammers can justify spamming to themselves.

That's not a utilitarian argument. I don't see why it should convince me at all.

Take it as a trolley problem. There are important issues where people die, and there are issues where one just acts out tribal loyalty. In this case I see no good reason for tribal loyalty, given what's at stake.

Comment author: Lumifer 08 January 2014 06:02:23PM 3 points [-]

There are important issues where people die

Like attempting to do a PR campaign for a non-profit via Wikipedia by piggybacking onto a Hollywood big-budget movie..?

Comment author: ChristianKl 09 January 2014 12:29:49AM *  2 points [-]

Like attempting to do a PR campaign for a non-profit via Wikipedia by piggybacking onto a Hollywood big-budget movie..?

I do consider the effect of shifting public perception of an existential-risk issue even a tiny bit to be worth lives. UFAI is on the road to killing people. I think you're failing to multiply if you believe that isn't worth lives.

Comment author: David_Gerard 08 January 2014 09:08:51PM *  2 points [-]

People are not going to die if you refrain from deliberately spamming Wikipedia. There should be a Godwin-like law about this sort of comparison. (That's quite apart from your failure to calculate the damage to MIRI's reputation if they become known as spammers.)

Instead, see if you can get organic coverage going. Can MIRI get press coverage about the issue, if they feel it's to their benefit to do so? (This should probably be something directed from MIRI itself.) Get journalists seriously talking about the Friendly AI issue? Should be able to be swung.

Comment author: ChristianKl 09 January 2014 12:26:05AM 2 points [-]

Having the wrong experts on AI risk cited in the article, at a critical juncture where the public develops its understanding of the issue, can result in people getting killed.

If it shifts the probability of a UFAI disaster by even 0.001%, that equals over a thousand lives saved in expectation. That's probably a bigger effect than the five people you save by pushing the fat man.
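The expected-value claim can be checked with back-of-the-envelope arithmetic. A minimal sketch, assuming a world population of roughly seven billion and treating a UFAI disaster as killing everyone (both assumptions are mine, for illustration only):

```python
# Expected lives saved from a tiny shift in disaster probability.
# Assumed: world population ~7 billion; a UFAI disaster kills everyone.

world_population = 7_000_000_000

# A 0.001% shift in probability is 1 in 100,000.
expected_lives_saved = world_population // 100_000
print(expected_lives_saved)  # 70000

# Compare with the five lives saved in the trolley problem.
trolley_lives = 5
print(expected_lives_saved // trolley_lives)  # 14000
```

Even under far more conservative assumptions the product stays well above the "over a thousand lives" figure in the comment.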

The moral cost you pay by pushing the fat man is higher than the moral cost of violating Wikipedia norms. The benefit of getting the narrative right in the article on AI risk is probably much more valuable than the handful of people you save in the trolley example.