
mesolude comments on Open thread, 16-22 June 2014 - Less Wrong Discussion

2 Post author: David_Gerard 16 June 2014 01:12PM




Comment author: Gunnar_Zarncke 16 June 2014 04:50:26PM, 7 points

I'm trying to track down a fallacy or effect that was once explained to me and which I found plausible: the idea that whoever has the more complex and detailed mental model of the topic under discussion wins the argument, independent of the actual truth of the matter (and assuming no malicious intent).

The example cited, as I remember it, was about visual (microscope) inspection of blood samples for some boolean factor (present or not). Two people were given the same samples and trained to recognize the factor: one was always told the truth during training, while the other was lied to a certain fraction of the time. After the learning period, both had to decide on the factor for some samples together. The result: even though the person who was lied to had the less accurate model, he almost always dominated the joint decision.

The offered explanation was that the lied-to candidate had the more complex model (it somehow had to incorporate factors accounting for the lies), and this gave him a larger supply of arguments (criteria to look for that supposedly explained the difference) with which to convince the other person, despite the falsity of those arguments.

Problem is: I can't find any studies or the like supporting this. Do you know of such a model-strength effect? I think it is quite relevant, as it seems to underlie the ability of liars and rhetoricians to convince an audience by making up complex and impressive structures independent of their truth (the truth just has to be sufficiently inaccessible).

Comment author: mesolude 17 June 2014 03:25:11AM, 2 points

Perhaps something like the representativeness heuristic? While more details make a claim sound more believable, each added detail is another thing that could be incorrect.