PhilGoetz comments on Great Product. Lousy Marketing. - Less Wrong

Post author: BenAlbahari 28 February 2010 09:33AM




Comment author: PhilGoetz 02 March 2010 10:47:16PM * 3 points

I'd rather ask the question without the word "sometimes". Because what people do is use that word "sometimes" as a rationalization. "We'll only use the Dark Arts in the short term, in the run-up to the Singularity." The notion is that once everybody becomes rational, we can stop using them.

I'm skeptical that will happen. As we become more complex reasoners, we will develop new bugs and weaknesses in our reasoning for more-sophisticated dark artists to exploit. And we will have more complicated disagreements with each other, with higher stakes; so we will keep justifying the use of the Dark Arts.

Comment author: ata 03 March 2010 08:18:51AM * 1 point

"As we become more complex reasoners, we will develop new bugs and weaknesses in our reasoning for more-sophisticated dark artists to exploit."

Are we expecting to become more complex reasoners? It seems to me to be the opposite. We are certainly moving in the direction of reasoning about increasingly complex things, but by all indications, the mechanisms of normal human reasoning are much more complex than they should be, which is why they have so many bugs and weaknesses in the first place. Becoming better at reasoning, in the LW tradition, appears to consist entirely of removing components (biases, obsolete heuristics, bad epistemologies and cached thoughts, etc.), not adding them.

If the goal is to become perfect Bayesians, then the goal is simplicity itself. I realize that is probably an impossible goal — even if the Singularity happens and we all upload ourselves into supercomputer robot brains, we'd need P=NP in order to compute all of our probabilities to exactly where they should be — but every practical step we take, away from our evolutionary patchwork of belief-acquisition mechanisms and toward this ideal of rationality, is one less opportunity for things to go wrong.
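The contrast ata draws can be made concrete: the Bayesian update rule itself is almost trivially simple to state, and the difficulty only appears when the hypothesis space grows large. A toy sketch (the function name and example numbers are my own, purely illustrative):

```python
def bayes_update(prior, likelihood_given_h, likelihood_given_not_h):
    """Return P(H | E) via Bayes' theorem, for a single binary hypothesis.

    The rule itself is one line of arithmetic; the intractability ata
    alludes to arises only when there are exponentially many hypotheses
    to normalize over, not in a toy case like this.
    """
    evidence = likelihood_given_h * prior + likelihood_given_not_h * (1 - prior)
    return likelihood_given_h * prior / evidence

# Example: a 1% prior, and evidence with a 90% true-positive rate
# and a 5% false-positive rate.
posterior = bayes_update(0.01, 0.9, 0.05)
print(round(posterior, 4))  # the posterior is still well under 50%
```

The point of the sketch is that the ideal is simple in specification even though it is infeasible in full generality, which is ata's distinction between the complexity of what we reason about and the complexity of the reasoning mechanism itself.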

Comment author: BenAlbahari 03 March 2010 01:03:13AM 0 points

"As we become more complex reasoners, we will develop new bugs and weaknesses in our reasoning for more-sophisticated dark artists to exploit. And we will have more complicated disagreements with each other, with higher stakes; so we will keep justifying the use of the Dark Arts."

This is exactly the chain of reasoning I had in mind in my original post when I referred to the "big if".