VAuroch comments on Epistemic Viciousness - Less Wrong

Post author: Eliezer_Yudkowsky 13 March 2009 11:33PM




Comment author: VAuroch 09 December 2013 08:00:21AM -1 points

I don't think there's a rationalist equivalent of eye-gouging, so setting up tournament rules should be relatively easy.

In the most obvious way to test rationality, which is by debate, the various Dark Arts are something similar.

Comment author: TheOtherDave 09 December 2013 03:24:43PM 0 points

In the most obvious way to test rationality, which is by debate

Wait, what?
I can't quite tell if this is meant ironically.
Debate is far from the most obvious way to test rationality.

Comment author: VAuroch 09 December 2013 08:36:22PM -1 points

The most obvious way to test which of a group of people has more correct beliefs is by convincing others to adopt your more-correct beliefs.

Comment author: TheOtherDave 09 December 2013 09:56:26PM 0 points

OK, I'm now pretty sure you're serious.

So, let me make sure I understand your position. If you believe A, and I believe NOT-A, then on your account all of the following are true:
- The most obvious way to test whether A or NOT-A is by having us debate.
- If you convince me that A, then you have the more correct belief, and are therefore more rational.
- Thus, debate is the most obvious way to test rationality.

Have I understood your position correctly?

Comment author: VAuroch 09 December 2013 10:17:31PM *  0 points

If you believe A, and I believe NOT-A

  • The most obvious way to test whether A or NOT-A is by having us debate.

These parts are wrong. Debate is not for testing whether A or NOT-A is true; it is for testing what the most accurate posterior for Pr(A) is, given the evidence available, and who had better-assigned priors.

The reason debate is the most obvious test of rationality is Aumann's Agreement Theorem. If we debate our beliefs, and we are both perfectly rational, we will agree on all debated beliefs by the end of the debate. Provided the debate was strictly rational (rather than using Dark Arts), the person whose pre-debate beliefs most closely match the post-debate beliefs was the more rational on those issues, and can be presumed to still be more rational on other issues.
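For reference, the theorem being invoked here can be stated roughly as follows (the notation below is chosen for illustration, not taken from the comment): if two agents share a common prior and their posteriors for an event are common knowledge between them, then those posteriors must be equal.

```latex
% Rough statement of Aumann's Agreement Theorem (illustrative notation):
% Agents 1 and 2 share a common prior P and hold private information
% partitions I_1 and I_2. If, at some state of the world, it is common
% knowledge that
%     P(A \mid I_1) = q_1   and   P(A \mid I_2) = q_2,
% then the two posteriors coincide:
\[
  q_1 = q_2 .
\]
```

On this reading, a "strictly rational" debate is one in which exchanging posteriors drives the two parties toward this common value, rather than toward whichever party argues more persuasively.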

Comment author: TheOtherDave 10 December 2013 01:29:54AM 1 point

OK, thanks for clarifying your position.

So... if you and I debate issue X, and at the end of that debate your beliefs are completely unchanged, whereas mine have changed slightly, then we've determined that you are more rational than I with respect to X, and therefore probably more rational than I with respect to other issues... provided that the debate itself is "strictly rational."

Yes?

If so, two questions:
If the debate was not strictly rational, does the debate tell us anything about which of us is more rational?
Can you point me at an actual example of a strictly rational debate?

Comment author: VAuroch 10 December 2013 01:52:02AM -1 points

As previously mentioned, there are many other things which are better for being convincing but not rational, so an actual rational debate is pretty much an idealized thing. Some of the early Socratic dialogues probably count (I'm thinking specifically of the Euthyphro). I haven't read the Yudkowsky/Hanson AI FOOM debate; it might qualify as well.

Comment author: TheOtherDave 10 December 2013 02:43:38AM 0 points

Ah, gotcha. Now that I understand what you meant by "debate", your position is clearer. Thanks.