lukeprog comments on Reply to Holden on The Singularity Institute - Less Wrong

46 Post author: lukeprog 10 July 2012 11:20PM




Comment author: lukeprog 11 July 2012 05:53:16PM 1 point

I reject the paraphrase, and the test you link to involved a lot more than the CRT.

Comment author: siodine 11 July 2012 06:08:37PM *  -1 points

I reject the paraphrase

Why?

Direct quotes:

Holden: To me, the best evidence of superior general rationality (or of insight into it) would be objectively impressive achievements (successful commercial ventures, highly prestigious awards, clear innovations, etc.) and/or accumulation of wealth and power. As mentioned above, SI staff/supporters/advocates do not seem particularly impressive on these fronts...

That is synonymous with success in Western society. His definition of superior general rationality or insight (read: instrumental and epistemic rationality) fits with my paraphrase of that direct quote.

Luke: Unfortunately, this seems to misunderstand the term "rationality" as it is meant in cognitive science. As I explained elsewhere:

You think his definition is wrong.

Luke: Like intelligence and money, rationality is only a ceteris paribus predictor of success. So while it's empirically true (Stanovich 2010) that rationality is a predictor of life success, it's a weak one. (At least, it's a weak predictor of success at the levels of human rationality we are capable of training today.) If you want to more reliably achieve life success, I recommend inheriting a billion dollars or, failing that, being born+raised to have an excellent work ethic and low akrasia.

I.e., we shouldn't necessarily expect rational people to be successful. The only problem I see with my paraphrase is in explaining why some people aren't successful given that they're rational (per your definition), which is by having atypical goals. Well, that should make sense if they're instrumentally rational. (Of course, this discounts luck, but I don't think luck is an overriding factor on average here.)

the test you link to involved a lot more than the CRT.

This isn't useful information unless you also link to the other tests and show why they remain meaningful when people have trained to do well on them. I would take it out of your argument as it stands. (Also, it's a spider web of links -- which I've read before.)

Comment author: lukeprog 11 July 2012 06:24:05PM 2 points

Your paraphrase of me was:

Holden expects us to have epistemic and instrumental powers of rationality that would make us successful in Western society, however this is a strawman. Being rational isn't succeeding in society, but succeeding at your own goals.

But I didn't think that what Holden got wrong was a confusion between one's own goals and "success in Western society" goals. Many of SI's own goals include "success in Western society" goals like lots of accumulated wealth and power. Instead, what I thought Holden got wrong was his estimate of the relation between rationality and success.

Re: the testing. LWers hadn't trained specifically for the battery of tests given to them that day, but they outperformed every other group I know of that has taken those tests. I agree that these data aren't as useful as the data CFAR is collecting now about the impact of rationality training on measures of life success, but they are suggestive enough to support a weak, qualified claim like the one I made: that "it seems" LWers are more rational than the general population.

Comment author: komponisto 11 July 2012 07:02:38PM 9 points

It occurs to me that Holden's actual reasoning (never mind what he said) is perhaps not about rationality per se and instead may be along these lines: "Since SI staff haven't already accumulated wealth and power, they probably suffer from something like insufficient work ethic or high akrasia or not-having-inherited-billions, and thus will probably be ineffective at achieving the kind of extremely-ambitious goals they have set for themselves."

Comment author: lmm 10 February 2013 08:04:34PM 1 point

It may or may not be Holden's, but I think you've put your finger on my real reasons for not wanting to donate to SI. I'd be interested to hear any counterpoint.

Comment author: siodine 11 July 2012 06:35:34PM *  0 points

But I didn't think that what Holden got wrong was a confusion between one's own goals and "success in Western society" goals. Many of SI's own goals include "success in Western society" goals like lots of accumulated wealth and power. Instead, what I thought Holden got wrong was his estimate of the relation between rationality and success.

Right, then I (correctly, I think) took your reasoning a step farther than you did. SI's goals don't necessarily correspond with its members' goals. SIers may be there because they want to be around a lot of cool people, and may not have any particular desire to be successful (though I suspect many of them do). But this discounts luck, like the luck of being born conscientious -- of having the power to accomplish your goals. And like I said, poor luck of that kind is unconvincing when applied to a whole group of people.

that "it seems" like LWers are more rational than the general population.

When I, an unknown here, say "it seems", people will likely take me to be reporting an anecdote. When you, the executive director of SI and a researcher on this topic, say "it seems", I think people will take it as a weak impression of the available research. Scientists adept at communicating with journalists get around this by saying "I speculate" instead.