Salemicus comments on What Is Signaling, Really? - Less Wrong

74 points. Post author: Yvain 12 July 2012 05:43PM




Comment author: Salemicus 10 July 2012 07:18:46PM 32 points

I enjoyed this post.

It also hints at the notion of signaling equilibria. Consider the Helen of Troy example - this is clearly not an equilibrium, because Helen ends up marrying a bankrupt. Soon "spends lots of money on diamonds" will no longer be a signal of wealth, but will instead be a signal of profligacy - as indeed it is where I live. A man walking around in flashy jewellery would be considered low-class, presumably because in the past there has been exactly this signaling reversal.

In a stable signaling equilibrium, the signal needs to be hard to fake. This is why easy-to-fake signals are unstable - in the flowers example, the proles can and will catch on and switch to the upper-middle-class flowers, so the upper middle class has to keep moving to stay ahead of them. The same phenomenon is seen in baby names, where upper-middle-class names become prole after a generation.
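The hard-to-fake condition above can be sketched as a toy Spence-style costly-signaling model: a signal separates genuine senders from imitators only when it is worth sending for the type it is meant to reveal and too expensive for the faker. The function and all the numbers below are illustrative assumptions, not anything from the post.

```python
# Toy Spence-style signaling check (illustrative sketch, not from the post).
# A separating equilibrium exists only when the signal is cheap enough for
# the "genuine" type and too costly for an imitator - i.e. hard to fake.

def separating_equilibrium(benefit, cost_genuine, cost_faker):
    """Return True if signaling separates the two types.

    benefit      -- payoff from being believed (e.g. marrying Helen)
    cost_genuine -- cost of the signal for the type it is meant to reveal
    cost_faker   -- cost of the same signal for an imitator
    """
    genuine_signals = benefit > cost_genuine  # genuine type wants to signal
    faker_abstains = benefit < cost_faker     # imitator finds it not worth it
    return genuine_signals and faker_abstains

# Diamonds while only the rich can afford them: stable, hard to fake.
print(separating_equilibrium(benefit=10, cost_genuine=3, cost_faker=15))  # True

# Flowers anyone can buy: the proles catch on and the signal collapses.
print(separating_equilibrium(benefit=10, cost_genuine=3, cost_faker=4))   # False
```

On these assumed numbers, once the imitator's cost falls below the benefit the equilibrium breaks, which is the "keep moving to stay ahead" dynamic in the flowers and baby-names examples.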

One thing I would have preferred is a discussion of the positive externalities of signaling, not just the negative ones. For example, if Yvain and lukeprog are both trying to signal their superior intelligence by writing insightful posts, this may get them into an "arms race", losing them utility. However, the Less Wrong community gains utility overall. I think the externalities of signaling are generally positive in the real world; they only tend to be negative in what are anyway zero-sum games (e.g. begging).

Comment author: wedrifid 11 July 2012 02:49:06AM 14 points

One thing I would have preferred is a discussion of the positive externalities of signaling, not just the negative ones. For example, if Yvain and lukeprog are both trying to signal their superior intelligence by writing insightful posts, this may get them into an "arms race", losing them utility. However, the Less Wrong community gains utility overall. I think the externalities of signaling are generally positive in the real world; they only tend to be negative in what are anyway zero-sum games (e.g. begging).

You have just summarized "civilisation" in a nutshell.

Comment author: TheOtherDave 11 July 2012 03:25:36AM 0 points

...and still counts himself king of infinite space.

Comment author: Kindly 23 July 2012 01:54:14PM 1 point

Looks like your signaling backfired in this instance.

Comment author: TheOtherDave 23 July 2012 02:18:53PM 1 point

Yeah, I noticed that.

Comment author: Luke_A_Somers 13 July 2012 03:30:43PM 3 points

How does generating insightful posts end up being negative utility for them? Unless they don't LIKE generating insightful posts, which seems doubtful.

Comment author: Salemicus 13 July 2012 03:56:41PM 6 points

I am assuming that it takes effort to generate insightful posts, and that, for sufficiently large numbers of posts, the disutility of the effort predominates.

Comment author: Luke_A_Somers 14 July 2012 12:25:24PM 3 points

I have a very hard time envisioning them being driven by signaling to do far more productive work than they ought to for their own good.

This is because it was specified that the work remain high quality. If you work yourself to the bone (producing negative utility for yourself), the product will be sub-par.

Comment author: Kaj_Sotala 17 July 2012 09:38:56AM 7 points

Every now and then, I've neglected college assignments due to being more driven to write a post (for LW or elsewhere). E.g. The Curse of Identity lost me some points for a course grade, because I was so caught up with writing it that I didn't go to an exercise session that would have earned me the points.

Of course, whether or not this was overall disutility for myself is debatable.

Comment author: Decius 16 July 2012 12:54:12AM 3 points

If you work for maximum signaling value, what is the likelihood that you are also working for maximum productive value? Unless the signaling is completely without noise, the most effective signaling behavior will be less productive than the most productive behavior.