Mitchell_Porter comments on Open Thread June 2010, Part 2 - Less Wrong

7 Post author: komponisto 07 June 2010 08:37AM


Comment author: Mitchell_Porter 13 June 2010 07:52:03AM 4 points [-]

Do you ever have a day when you log on and it seems like everyone is "wrong on the Internet"? (For values of "everyone" equal to 3, on this occasion.) Robin Hanson and Katja Grace both have posts (on teenage angst, on population) where something just seems off, elusively wrong; and now SarahC suggests that "the only friendly AI may be one that commits suicide". Something about this conjunction of opinions seems obscurely portentous to me. Maybe it's just a know-thyself moment; there's some nascent opinion of my own that's going to crystallize in response.

Now that my special moment of sharing is out of the way... Sarah, is the friendly AI allowed to do just one act of good before it kills itself? Make a child smile, take a few pretty photos from orbit, save someone from dying, stop a war, invent cures for a few hundred diseases? I assume there is some integrity of internal logic behind this thought of yours, but it seems to be overlooking so much about reality that there has to be a significant cognitive disconnect at work here.

Comment author: cupholder 13 June 2010 05:58:11PM 0 points [-]

Robin Hanson and Katja Grace both have posts (on teenage angst, on population) where something just seems off, elusively wrong;

I've noticed I get this feeling relatively often from Overcoming Bias. I think it comes with the contrarian blogging territory.

Comment author: RichardKennaway 13 June 2010 07:25:23PM 1 point [-]

I get it from OB also, which I have not followed for some time, and many other places. For me it is the suspicion that I am looking at thought gone wrong.

Comment author: Rain 13 June 2010 08:04:28PM *  4 points [-]

I would call it "pet theory syndrome." Someone comes up with a way of "explaining" things and then suddenly the whole world is seen through that particular lens rather than having a more nuanced view; nearly everything is reinterpreted. In Hanson's case, the pet theories are near/far and status.

Comment author: JoshuaZ 13 June 2010 08:19:37PM 1 point [-]

I would call it "pet theory syndrome." Someone comes up with a way of "explaining" things and then suddenly the whole world is seen through that particular lens rather than having a more nuanced view; nearly everything is reinterpreted. In Hanson's case, the pet theories are near/far and status.

Prediction markets also.

Is anyone worried that LW might have similar issues? If so, what would be the relevant pet theories?

Comment author: Larks 13 June 2010 08:41:32PM 2 points [-]

On a related note: suppose a community of moderately rational people had one member who was a lot more informed than them on some subject, but wrong about it. Isn't it likely they might all end up wrong together? Prediction Markets was the original subject, but it could go for a much wider range of topics: Multiple Worlds, Hansonian Medicine, Far/near, Cryonics...

Comment author: Rain 13 June 2010 08:43:23PM 3 points [-]

That's where the scientific method comes in handy, though quite a few of Hanson's posts sound like pop psychology rather than a testable hypothesis.

Comment author: JoshuaZ 13 June 2010 07:44:53PM *  2 points [-]

I don't get this impression from OB at all. The thoughts at OB, even when I disagree with them, are far more coherent than the sort of examples given as thought gone wrong. I'm also not sure it is easy to distinguish between "thought gone wrong" in the sense of being outright nonsense, as described in the linked essay, and actually good but highly technical thought processes. For example, I could write something like:

Noetherianess of a ring is forced by being Artinian, but the reverse does not hold. The dual nature is puzzling given that Noetherianess is a property which forces ideals to have a real impact on the structure in a way that seems more direct than that of Artin even though Artinian is a stronger condition. One must ask what causes the breakdown in symmetry between the descending and ascending chain conditions.

Now, what I wrote above isn't nonsense. It is just poorly written, poorly explained math. But if you don't have some background, it likely looks as bad as the passages quoted in the linked essay. Even when the writing is not poor like that above, one can easily find sections from conversations on LW about, say, CEV or Bayesianism that look about as nonsensical if one doesn't know the terms. So without extensive investigation I don't think one can easily judge whether a given passage is nonsense or not, which makes the linked essay less than compelling. (In fact, having studied many of its examples, I can safely say that they really are nonsensical, but it isn't clear to me how one could tell that from the short passages given, with their complete lack of context. Edit: And it could very well be that I just haven't thought about them enough or approached them correctly, just as someone who is very bad at math might consider it collectively nonsense even after careful examination.) It does, however, seem that some disciplines run into this problem far more often than others. Philosophy and theology both seem to run into the problem of parading nonsensical strings of words more often than most other areas. I suspect this is connected to the lack of anything resembling an experimental method.
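For what it's worth, the asymmetry between the chain conditions alluded to above has a standard textbook witness (this example is an addition for illustration, not part of the original comment): the integers satisfy the ascending chain condition on ideals but not the descending one, so Z is Noetherian but not Artinian, while Artinian (with identity) does force Noetherian.

```latex
% Z is Noetherian: every ascending chain of ideals stabilizes.
% An ideal of Z is (n); ascending chains correspond to chains of divisors
% and must terminate, e.g.
(8) \subsetneq (4) \subsetneq (2) \subsetneq \mathbb{Z}.
% But Z is not Artinian: the descending chain
(2) \supsetneq (4) \supsetneq (8) \supsetneq (16) \supsetneq \cdots
% never stabilizes, so the DCC fails while the ACC holds.
```

So the two conditions really are inequivalent in general, even though DCC implies ACC for rings with identity (Hopkins–Levitzki), which is the "breakdown in symmetry" the passage gestures at.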

Comment author: RichardKennaway 13 June 2010 08:34:02PM 0 points [-]

The thoughts at OB even when I disagree with them are far more coherent than the sort of examples given as thought gone wrong. I'm also not sure it is easy to actually distinguish between "thought gone wrong" in the sense of being outright nonsense as drescribed in the linked essay and actually good but highly technical thought processes.

OB isn't a technical blog though.

Having criticised it so harshly, I'd better back that up with evidence. Exhibit A: a highly detailed scenario of our far future, supported by not much. Which in later postings to OB (just enter "dreamtime" into the OB search box) becomes part of the background assumptions, just as earlier OB speculations become part of the background assumptions of that posting. It's like looking at the sky and drawing in constellations (the stars in this analogy being the snippets of scientific evidence adduced here and there).

Comment author: JoshuaZ 13 June 2010 09:40:04PM 1 point [-]

That example seems to be more in the realm of "not very good thinking" than thought gone wrong. The thoughts are coherent, just not well justified. It isn't like the sort of thing quoted in the linked essay, where "thought gone wrong" seems to mean something closer to "not even wrong because it is incoherent."

Comment author: RichardKennaway 14 June 2010 08:24:22AM *  2 points [-]

Ok, OB certainly isn't the sort of word salad that Stove is attacking, so that wasn't a good comparison. But there does seem to me to be something systematically wrong with OB. There is the man-with-a-hammer thing, but I don't have a problem with people having their hobbyhorses, I know I have some of my own. I'm more put off by the way that speculations get tacitly upgraded to background assumptions, the join-the-dots use of evidence, and all those "X is Y" titles.

Comment author: SilasBarta 13 June 2010 07:46:08PM 0 points [-]

thought gone wrong.

Got a good summary of this? The author seems to be taking way too long to make his point.

Comment author: RichardKennaway 13 June 2010 08:12:32PM 0 points [-]

This paragraph, perhaps?

From an Enlightenment or Positivist point of view, which is Hume's point of view, and mine, there is simply no avoiding the conclusion that the human race is mad. There are scarcely any human beings who do not have some lunatic beliefs or other to which they attach great importance. People are mostly sane enough, of course, in the affairs of common life: the getting of food, shelter, and so on. But the moment they attempt any depth or generality of thought, they go mad almost infallibly. The vast majority, of course, adopt the local religious madness, as naturally as they adopt the local dress. But the more powerful minds will, equally infallibly, fall into the worship of some intelligent and dangerous lunatic, such as Plato, or Augustine, or Comte, or Hegel, or Marx.

I think that should go in the next quotes thread.

Comment author: khafra 14 June 2010 10:55:20AM 2 points [-]

Or perhaps the quotes thread from 12 months ago.