Alicorn comments on Let There Be Light - Less Wrong
It's possible, although it seems unlikely on priors, that I'm relatively unusual in preferring that I actually be nice/smart/reasonable/friendly/etc. over preferring that I merely think that I'm those things. This seems to me much like preferring that my family be actually alive and well, over merely thinking that they are alive and well.
From a purely practical standpoint, people might notice if you actually have negative personal traits, even if your positive self-image lets you signal otherwise fairly well. They will then think you are arrogant and deluded (in addition to having whatever negative traits you were trying to signal away).
Have you ever met one of those people who tells bad jokes all the time? This seems a quintessential example of someone with a strong false positive self-image.
What predictions does this model let you make? When have you seen it compellingly confirmed in situations where other models would have had you predict something else? It sounds dangerously vulnerable to epicyclic adaptation to individual cases that don't align with it.
The 'fake it until you make it' school of self-improvement is based around this kind of model. For example, if you want to be a self-confident person and derive the benefits of self-confidence, start out 'faking' self-confidence by mimicking the behaviours and signals of self-confident people. Other people will generally respond to this as they would respond to someone who is 'actually' self-confident, and a virtuous circle will result in you eventually not having to fake the self-confidence any more.
A prediction of this kind of model might therefore be that the best way to improve self-confidence is to consciously mimic the behaviours of self-confident individuals rather than to try to 'internally' improve your self-confidence. Anecdotally I see some evidence that this works, but I also see some evidence that evolution has made people better at detecting fakers than a naive version of the model might suppose.
A person who has a more realistic self-image than average might appear less nice than an average person who is equally nice. Thus, the choice to improve your epistemic rationality also implicitly misleads the people you interact with into believing you are a less nice person than you actually are.
I understand your first sentence, and agree ceteris paribus (but I think the person with the realistic beliefs is in a better position to become actually nicer). Your second makes no sense to me. How is it implicitly lying to have accurate beliefs about how nice you are? The other way around seems more plausible.
The improved accuracy is a property of your own beliefs about yourself, not of other people's beliefs about you. By increasing the accuracy of your beliefs about yourself, you simultaneously decrease the accuracy of other people's beliefs about you (unless you compensate with additional signalling by other means, which may be impossible in a number of cases). Consciously compromising the accuracy of other people's beliefs is usually called lying, or at least not technically lying.
I think that may be the most roundabout and head-spinny justification for self-deception I've ever heard. Wow. By a similar token, should I not take up gardening if it's not within my power to update everyone who has the belief that I don't garden?
Note that I don't endorse self-deception, see my other comment in this thread. But the argument points to a negative trait of the choice. (The argument is related to a stance that as a rationalist, you'd want to use rhetoric as much as is common (but not more), to avoid signaling the incorrect fact of weakness of your position.)
Normally, if you take up gardening, other people's level of belief will either be unchanged (prior state of knowledge: they don't have new evidence), or will move up (towards the truth) upon receiving new evidence. Here, the situation is reversed: new evidence (not new action -- this is a point where your analogy breaks) will move people's belief away from the truth.