Vive-ut-Vivas comments on What I've learned from Less Wrong - Less Wrong

Post author: Louie 20 November 2010 12:47PM

Comment author: Vive-ut-Vivas 20 November 2010 01:45:52PM 4 points

It's probably useful at this point to differentiate between actual beliefs and signaled beliefs, particularly because if your beliefs accurately control your anticipations, you would know which beliefs you want to signal for social purposes.

Comment author: timtyler 20 November 2010 02:12:31PM 4 points

...though it is also worth noting that humans have evolved to be reasonably good lie-detectors.

If your actual beliefs don't match your signalled beliefs, others may pick up on that, expose you as a liar, and punish you.

Comment author: saturn 20 November 2010 08:56:25PM 3 points

You can choose to think of signaling beliefs as lying, but that's not very helpful to anyone. It's what most people do naturally and therefore not a violation of anyone's expectations in most contexts. Maybe instead it should be called speaking Statusese.

People don't pick up on the literal truth of your statements but on your own belief that you are doing something wrong. For instance, writers of fiction aren't typically considered immoral liars.

Comment author: sark 20 November 2010 10:29:49PM 0 points

People will agree to fiction not being true, but not to their professed beliefs not being true.

Comment author: timtyler 20 November 2010 09:15:00PM 0 points

Signalling beliefs that don't match your actual beliefs is what I said and meant.

Like claiming to be a vegan, and then eating spam.

Comment author: saturn 20 November 2010 09:26:15PM 4 points

If the whole world claims to be vegan and then eats spam, and moreover sees this as completely normal and expected, and sees people who don't do it as weird and untrustworthy, what exactly are you accomplishing by refusing to go along with it?

Comment author: sark 20 November 2010 10:33:18PM 0 points

Some of us have trouble keeping near and far modes separate. People like us, if we try professing veganism, will find ourselves ending up not eating spam.

My personal solution is to lie; I'm actually quite good at it!

Comment author: timtyler 20 November 2010 09:32:58PM 0 points

What does that have to do with the topic? That was just an example of signalling beliefs that don't match your actual beliefs.

Comment author: wedrifid 20 November 2010 09:03:21PM 0 points

One could as easily say that it isn't useful to consider lying from the viewpoint of morality.

Comment author: Vive-ut-Vivas 20 November 2010 02:16:53PM 1 point

And ideally, you'd take that fact into account in forming your actual beliefs. I think it's pretty well-established here that having accurate beliefs shouldn't actually hurt you. It's not a good strategy to change your actual beliefs so that you can signal more effectively -- and it probably wouldn't work, anyway.

Comment author: timtyler 20 November 2010 02:22:21PM 3 points

> I think it's pretty well-established here that having accurate beliefs shouldn't actually hurt you.

Hmm: Information Hazards: A Typology of Potential Harms from Knowledge ...?

Comment author: Vive-ut-Vivas 20 November 2010 02:35:13PM 0 points

I haven't read that paper (thanks for the link; I'll definitely do so), but it seems that's a separate issue from choosing which beliefs to hold based on what they will do for your social status. Still, I would argue that limiting knowledge is only preferable in select cases -- not a good general rule to abide by, partial knowledge of biases and such notwithstanding.

Comment author: wedrifid 20 November 2010 09:07:58PM 1 point

> I think it's pretty well-established here that having accurate beliefs shouldn't actually hurt you.

Not at all. It is well established that having accurate beliefs should not hurt a perfect Bayesian intelligence. Believing that this applies to mere humans would be naive in the extreme.

> It's not a good strategy to change your actual beliefs so that you can signal more effectively -- and it probably wouldn't work, anyway.

The fact that we are so damn good at it is evidence to the contrary!

Comment author: Vive-ut-Vivas 20 November 2010 09:36:52PM 0 points

I'm not understanding the disagreement here. I'll grant that imperfect knowledge can be harmful, but is anybody really going to argue that it isn't useful to try to have the most accurate map of the territory?

Comment author: wedrifid 20 November 2010 10:48:42PM 1 point

We are talking about signalling. So for most people, yes.