
Gunnar_Zarncke comments on The Market for Lemons: Quality Uncertainty on Less Wrong - Less Wrong Discussion

8 Post author: signal 18 November 2015 10:06PM




Comment author: moridinamael 19 November 2015 04:19:47PM 7 points

Yeah, this highlights my overall issue with the OP.

Elon Musk's path to success is well-known and not replicable. His story relies too much on (1) luck and (2) high-IQ-plus-ultra-high-conscientiousness, in that order of importance. Elon Musk is a red herring in these discussions.

More to the point, there is already an absurd overabundance of available information about how to be quite successful in business. It is not to the comparative advantage of LW to try to replicate this type of content. Likewise, the Internet hosts an absurd overabundance of practical, useful advice on

  • how to exercise, with the aim of producing any given physical result
  • how to succeed at dating, to whatever end desired
  • how to manage one's personal finances
  • etc.

It is not the role of LW to comprehensively answer all these questions. LW has always leaned more toward rationality qua rationality. More strategy, fewer tactics.

Also, I think the OP is attacking a straw man to a large degree. Nobody here thinks that LW has already immanentized the eschaton. Nobody here thinks that LW has already solved rationality. We're just a group of people interested in thinking about and discussing these types of considerations.


All that said, when I first discovered LW (and particularly the Sequences), it was such a cognitive bombshell that I genuinely expected my life and mind would be completely changed. And that expectation was sort of borne out, but in ways that only make sense in a post hoc fashion. That is, I used LW-inspired cognition for a lot of major life choices, but it's impossible to do A/B testing and determine whether those were the right choices, because I don't have access to the world where I made the opposite choice. (People elsewhere in this very comment thread repeat the meme that "LWers are not more rational than average." Well, how would you know if they were? What would that even mean?)

Comment author: Gunnar_Zarncke 21 November 2015 01:22:34PM 0 points

Just because the example wasn't well chosen doesn't invalidate the argument per se.