Comment author: ScottMessick 12 July 2012 06:54:23PM *  16 points [-]

I had long ago (but after being heavily influenced by Overcoming Bias) thought that signaling could be seen simply as a corollary to Bayes' theorem. That is, when one says something, one knows that its effect on a listener will depend on the listener's rational updating on the fact that one said it. If one wants the listener to behave as if X is true, one should say something that the listener would only expect in case X is true.

Thinking in this way, one quickly arrives at conclusions like "oh, so hard-to-fake signals are stronger" and "if everyone starts sending the same signal in the same way, that makes it a lot weaker", which test quite well against observations of the real world.
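The "hard-to-fake signals are stronger" conclusion can be sketched with Bayes' theorem in odds form: posterior odds = prior odds × likelihood ratio. The numbers below are hypothetical, chosen only to illustrate the point; a signal that non-X senders can rarely fake has a large likelihood ratio and moves the listener's belief a lot, while a signal everyone sends barely moves it.

```python
def posterior_prob(prior, p_signal_given_x, p_signal_given_not_x):
    """Listener's updated P(X | signal), via Bayes' theorem in odds form."""
    prior_odds = prior / (1 - prior)
    likelihood_ratio = p_signal_given_x / p_signal_given_not_x
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

prior = 0.5  # listener starts undecided about X

# Hard-to-fake signal: senders for whom X is false can rarely produce it.
hard = posterior_prob(prior, 0.9, 0.05)   # likelihood ratio = 18

# Diluted signal: nearly everyone sends it, X or not.
cheap = posterior_prob(prior, 0.9, 0.8)   # likelihood ratio = 1.125

print(round(hard, 3))   # 0.947 -- strong update
print(round(cheap, 3))  # 0.529 -- barely any update
```

This also shows why a signal weakens when everyone starts sending it: copying the signal raises P(signal | not-X), which shrinks the likelihood ratio toward 1.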

Powerful corollary: we should expect signaling, along with these basic properties, to be prominent in any group of intelligent minds. For example, math departments and alien civilizations. (Non-example: solitary AI foom.)

Comment author: kalos 13 July 2012 02:07:57PM 4 points [-]

This article made me think the same thing. Signaling is essentially gaming Bayes' theorem: providing what one believes others will count as evidence of the appropriate strength, so as to get them to update toward a desired conclusion.