Will_Newsome comments on Two Truths and a Lie - Less Wrong

59 points · Post author: Psychohistorian 23 December 2009 06:34AM




Comment author: pjeby 23 December 2009 04:18:32PM 15 points

it's real easy to look at a behavior that doesn't seem to make sense otherwise and say "oh, duh, signalling". The key is that the behavior doesn't make sense otherwise: it's costly, and that's an indication that, if people are doing it, there's a benefit you're not seeing.

People do all sorts of insane things for reasons other than signaling, though. Like because their parents did it, or because the behavior was rewarded at some point.

Of course, signaling behavior is often rewarded precisely because it succeeds as signaling... which means it might be more accurate to say that people do things because they've been rewarded for doing them at some point, and it just so happens that signaling behavior is often rewarded.

(Which is just the sort of detail we would want to see from a good theory of signaling -- or anything else about human behavior.)

Unfortunately, the search for a Big Idea in human behavior is kind of dangerous. Not just because a big-enough idea gets close to being tautological, but also because it's a bad idea to assume that people are sane or do things for sane reasons!

If you view people as stupid robots that latch onto and imitate the first patterns they see that produce some sort of reward (while freezing out anything that produces pain early on), and then stubbornly refuse to change despite all reason, that's definitely a big enough Idea to explain nearly everything important about human behavior.

We just don't like that idea because it's not beautiful and elegant, the way Big Ideas like evolution and relativity are.

(It's also not the sort of idea we're looking for, because we want Big Ideas about psychology to help us bypass any need to understand individual human beings and their tortured histories, or even to look at their current programming. Unfortunately, this is like expecting a Theory of Computing to let us predict obscure problems equally well in Vista and OS X, without ever looking at the source code or development history of either one.)

Comment author: Will_Newsome 13 September 2010 11:10:38PM 1 point

(It's also not the sort of idea we're looking for, because we want Big Ideas about psychology to help us bypass any need to understand individual human beings and their tortured histories, or even to look at their current programming. Unfortunately, this is like expecting a Theory of Computing to let us predict obscure problems equally well in Vista and OS X, without ever looking at the source code or development history of either one.)

So do you think that overcoming akrasia, say, necessitates understanding your self-programming via a set of decent algorithms for doing so (akin to what Less Wrong provides for epistemic rationality) that let you figure out for yourself whatever problems you may have? That would be a little worrying, insofar as akrasia might resemble the blue screen of death in your Theory of Computing example: a common failure mode arising from any number of underlying problems, resolvable only by applying high-level learned algorithms that most people simply don't have and never bother to find, and that those who do find them can't express succinctly enough to be memetically fit.

On top of that, just as most people never notice that they're horrible epistemic rationalists with a higher standard they could aspire to, even good epistemic rationalists may notice that they're sub-par along many dimensions of instrumental rationality and yet completely fail to be motivated to do anything about it. They pride themselves on being correct, not being successful, in the same way that most people pride themselves on their success rather than their correctness. Most people gerrymander their definition of correctness to mean success, just as rationalists may gerrymander their definition of success to mean correctness, and both end up losing: either by succeeding at the wrong things or by failing to succeed at the right ones.

Comment author: pjeby 14 September 2010 05:12:55PM 2 points

So do you think that overcoming akrasia, say, necessitates understanding your self-programming via a set of decent algorithms for doing so (akin to what Less Wrong provides for epistemic rationality) that let you figure out for yourself whatever problems you may have?

Yes; see here for why.

Btw, it would be more accurate to speak of "akrasias" as individual occurrences, rather than "akrasia" as a non-countable. One can overcome an akrasia, but not "akrasia" in some general sense.

they pride themselves on being correct, not being successful

Yep, major failure mode. Been there, done that. ;-)

Comment author: NancyLebovitz 14 September 2010 07:48:20PM -2 points

Btw, it would be more accurate to speak of "akrasias" as individual occurrences, rather than "akrasia" as a non-countable. One can overcome an akrasia, but not "akrasia" in some general sense.

I bet you think the war on terror is a badly framed concept.