
Algernoq comments on Why I Am Not a Rationalist, or, why several of my friends warned me that this is a cult - Less Wrong Discussion

12 Post author: Algernoq 13 July 2014 05:54PM




Comment author: Algernoq 13 July 2014 11:13:06PM 2 points [-]

are we so bad at communicating our ideas?

Not generally -- I keep coming back for the clear, on-topic, well-reasoned, non-flame discussion.

Not sure exactly what you suggest here. We should not waste time reflecting...but...

Many meetups and discussion topics (I'd guess 40-70%) focus on pursuing rational decision-making for self-improvement. Honestly, I feel guilty about not doing more work, and I assume other readers are here not because it's optimal but because it's fun.

There's also a sentiment that being more Rational would fix one's problems. Often, though, it's a lack of information, not a lack of reasoning, that's causing the problem.

This is not strong evidence against the usefulness of LW.

I agree, and I agree LW is frequently useful. I would like to see more deference to non-technical experts on non-technical topics. As an extreme example, I'm thinking of a forum post where some (presumably young) poster asked for a Bayesian estimate of whether a "girl still liked him" based on her not calling, upvoted answers containing Bayes' Theorem and percentage numbers, and downvoted my answer telling him he hadn't provided enough information. More generally, I think LW can have a problem similar to one in some Christian literature, where people take "(X) Advice" because they are part of the (X) community, even when it is not the best advice available.

Essentially, I think the LW norms should encourage people to learn proven technical skills relevant to their chosen field, and should acknowledge that it's only advisable to think about Rationality all day if that's what you enjoy for its own sake. I'm not sure to what extent you already agree with this.

A few LW efforts appear to me to be sub-optimal and possibly harmful to those pursuing them, but this isn't the place for that argument.

How should we do a debate about math, science and philosophy... for non-intellectuals?

Not answering this question limits the spread of LW, because it's easy to dismiss people as insufficiently intellectual when they don't join the group. I don't know the answer here.

A movement aiming to remove errors in thinking is claiming a high standard for being right.

WTF?! Please provide evidence of LW encouraging PhD students at top-10 universities to drop out

The PhD student dropping out of a top-10 school to try to do a startup after attending a month-long LW event I heard secondhand from a friend. I will edit my post to avoid spreading rumors, but I trust the source.

real LW does not resemble the picture you described

I'm glad your experience has been more ideal.

Comment author: Viliam_Bur 14 July 2014 07:03:17AM *  7 points [-]

The PhD student dropping out of a top-10 school to try to do a startup after attending a month-long LW event I heard secondhand from a friend. I will edit my post to avoid spreading rumors, but I trust the source.

If it did happen, then I want to know that it happened. It's just that this is the first time I've even heard of a month-long LW event. (Which may say something about my ignorance -- EDIT: it did, indeed -- since until yesterday I didn't even know SPARC takes two weeks, so I thought one week was the maximum for an LW event.)

I heard a lot of "quit the school, see how successful and rich Zuckerberg is" advice, but it was all from non-LW sources.

I can imagine people at some LW meetup giving this kind of advice, since nothing prevents people with such opinions from visiting LW meetups and giving advice. It just seems unlikely, and it certainly is not the LW "crowd wisdom".

Comment author: kbaxter 16 July 2014 02:23:27AM *  6 points [-]

Here's the program he went to, which did happen exactly once. It was a precursor to the much shorter CFAR workshops: http://lesswrong.com/lw/4wm/rationality_boot_camp/

That said, as his friend I think the situation is a lot less sinister than it's been made out to sound here. He didn't quit to go to the program; he quit a year or so afterwards to found a startup. He wasn't all that excited about his PhD program and he was really excited about startups, so he quit and founded one with some friends.

Comment author: Viliam_Bur 16 July 2014 09:00:01AM 0 points [-]

Thanks!

Now I remember hearing about that in the past, but I had completely forgotten. It actually took ten weeks!

Comment author: TheMajor 14 July 2014 12:32:40AM 4 points [-]

Often, it's a lack of information, not a lack of reasoning, that's causing the problem.

Embracing the conclusion implied by new information, even when it disagrees with your initial guess, is a vital skill that many people lack. I was first introduced to this problem here on LW. Of course your claim might still be valid, but I'd like to point out that some members (me) wouldn't have been able to take your advice if it weren't for the material here on LW.

I'm thinking of a forum post where some (presumably young) poster asked for a Bayesian estimate on whether a "girl still liked him" based on her not calling

The problem with this example is really interesting: there exists some (subjectively objective) probability, which we can find with Bayesian reasoning. Your recommendation is meta-advice; rather than attempting to find this probability, you suggest investing some time and effort to get more evidence. I don't see why this would deserve downvotes (rather, I would upvote it, I think), but note that a response containing percentages and Bayes' Theorem is an answer to the question.

Comment author: ChristianKl 14 July 2014 12:29:13PM 3 points [-]

Saying he didn't provide enough information for a probability estimate deserves downvotes because it misses the point. You can give a probability estimate based on whatever information is presented. The estimate will be better with more information, but it's still possible to make one with very little.
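For what it's worth, the mechanics of such a low-information estimate are simple. A minimal sketch in Python, with every number entirely hypothetical (made up to illustrate the update, not to be taken as real advice):

```python
# Hypothetical Bayesian estimate: P(she likes him | she didn't call).

def bayes_update(prior, likelihood, likelihood_complement):
    """Return P(H | E) given P(H), P(E | H), and P(E | not H)."""
    numerator = likelihood * prior
    denominator = numerator + likelihood_complement * (1 - prior)
    return numerator / denominator

prior = 0.5               # P(likes): no strong prior either way
p_no_call_if_likes = 0.3  # P(didn't call | likes): busy, shy, waiting...
p_no_call_if_not = 0.8    # P(didn't call | doesn't like)

posterior = bayes_update(prior, p_no_call_if_likes, p_no_call_if_not)
print(round(posterior, 3))  # 0.273
```

The point being: even with made-up likelihoods, the structure of the update is well-defined; more evidence just sharpens the inputs.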

Comment author: Luke_A_Somers 05 September 2014 07:43:30PM 0 points [-]

Using a Value of Information calculation would be best, especially if tied to proposed experiments.
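To make that concrete: a toy expected-value-of-perfect-information (EVPI) calculation, with payoffs and probabilities that are entirely hypothetical:

```python
# Toy Value-of-Information calculation for a two-action decision:
# "ask her out" vs. "don't". Payoffs are made-up utilities.
p_likes = 0.3
payoff = {("ask", True): 10, ("ask", False): -2,
          ("dont", True): -5, ("dont", False): 0}

def expected_value(action, p):
    return p * payoff[(action, True)] + (1 - p) * payoff[(action, False)]

def best_ev(p):
    return max(expected_value(a, p) for a in ("ask", "dont"))

# Acting now: pick the action with the higher expected value.
ev_now = best_ev(p_likes)

# With perfect information we'd learn the truth first, then act optimally.
ev_perfect = p_likes * best_ev(1.0) + (1 - p_likes) * best_ev(0.0)

voi = ev_perfect - ev_now  # EVPI: most you'd pay for more evidence
```

If the cost of gathering more evidence (a proposed "experiment") is below `voi`, gather it; otherwise just act.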

Comment author: ThisSpaceAvailable 14 July 2014 08:17:35AM 0 points [-]

I'd just like to point out that a ranking is a function of both the school and the metric, so the phrase "top-10 school" is not really well-formed. While it does convey significant information, it implies undue precision, and allowing people to sneak in unstated metrics is problematic.

Comment author: ChristianKl 14 July 2014 12:12:12PM *  0 points [-]

At the same time, you seem to criticise LW both for being self-help and for approaching rationality in an intellectual way that doesn't maximize life outcomes.

I do think plenty of people on LW care about rationality in an intellectual way, care about developing the idea of rationality, and care about questions such as what happens when we apply Bayes' theorem in situations where it usually isn't applied.

In the case of deciding whether "a girl still likes a guy", a practical answer focused on the situation would probably encourage the guy to ask the girl out. As you describe the situation, nobody actually gave the advice that calculating probabilities is a highly useful way to deal with the issue.

However, that doesn't mean the question of applying Bayes' theorem to the situation is worthless. You might learn something about the practical application of Bayes' theorem, and you get probability numbers that you could use to calibrate yourself.

Do you argue that calibrating your predictions for high-stakes emotional situations isn't a skill worth exploring, simply because we live in a world where nearly nobody is good at making calibrated predictions in such situations?
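Calibration can be measured concretely; one standard metric is the Brier score (mean squared error between stated probabilities and outcomes; lower is better). A sketch with a hypothetical track record:

```python
# Brier score: how well do your stated probabilities match reality?
def brier_score(predictions, outcomes):
    return sum((p - o) ** 2 for p, o in zip(predictions, outcomes)) / len(predictions)

# Hypothetical track record: probabilities assigned vs. what happened (1/0).
preds = [0.9, 0.7, 0.3, 0.8]
actual = [1, 1, 0, 0]
score = brier_score(preds, actual)
print(score)  # 0.2075
```

A perfectly calibrated, perfectly confident forecaster scores 0; always answering 0.5 scores 0.25, so anything below that beats pure hedging.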

At LW we try to do something new. The fact that new ideas often fail doesn't imply that we shouldn't experiment with new ideas. If you aren't curious about exploring new ideas and only want practical advice, LW might not be the place for you.

The simple aspect of feeling agentship in the face of uncertainty also shouldn't be underrated.

The PhD student dropping out of a top-10 school to try to do a startup after attending a month-long LW event I heard secondhand from a friend.

Are you arguing that there are no cases where a PhD student has a great idea for a startup and should leave his PhD to put that idea into practice? Especially when he might have the connections to secure the necessary venture capital?

The PhD student dropping out of a top-10 school to try to do a startup after attending a month-long LW event I heard secondhand from a friend.

I don't know about month-long LW events, except maybe internships with an LW-affiliated organisation. Doing internships in general can bring people to do things they wouldn't have thought about before.

Comment author: Algernoq 15 July 2014 12:10:31AM 1 point [-]

Do you argue that calibrating your predictions for high-stakes emotional situations isn't a skill worth exploring ...?

No, I agree it's generally a worthwhile skill. I objected to the generalization from insufficient evidence, when additional evidence was readily available.

At LW we try to do something new. The fact that new ideas often fail doesn't imply that we shouldn't experiment with new ideas. If you aren't curious about exploring new ideas and only want practical advice, LW might not be the place for you.

I guess what's really bothering me here is that less-secure or less-wise people can be taken advantage of by confident-sounding higher-status people. I suppose this is no more true in LW than in the world at large. I respect trying new things.

The simple aspect of feeling agentship in the face of uncertainty also shouldn't be underrated.

Hooray, agency! This is a question I hope to answer.

Are you arguing that there are no cases where a PhD student has a great idea for a startup and should leave his PhD to put that idea into practice? Especially when he might have the connections to secure the necessary venture capital?

I'm arguing that it was the wrong move in this case, and hurt him and others. In general, most startups fail, ideas are worthless compared to execution, and capital is available to good teams.

Comment author: kbaxter 16 July 2014 02:28:26PM *  3 points [-]

By what metric was his decision wrong?

If he's trying to maximize expected total wages over his career, staying in academia isn't a good way to do that. Although he'd probably be better off at a larger, more established company than at a startup.

If he's trying to maximize his career satisfaction, and he wasn't happy in academia but was excited about startups, he made a good decision. And I think that was the case here.

Some other confounding factors about his situation at the time:

  • He'd just been accepted to Y Combinator, which is a guarantee of mentoring and venture capital

  • Since he already had funding, it's not like he was dumping his life savings into a startup expecting a return

  • He has an open invitation to come back to his PhD program whenever he wants

If you still really want to blame someone for his decision, I think Paul Graham had a much bigger impact on him than anyone associated with LessWrong did.

Comment author: Algernoq 16 July 2014 04:26:02PM *  5 points [-]

YC funding is totally worth going after! He made the right choice given that info. That's what I get for passing on rumors.

Comment author: ChristianKl 15 July 2014 08:18:17AM *  0 points [-]

No, I agree it's generally a worthwhile skill. I objected to the generalization from insufficient evidence, when additional evidence was readily available.

It's an online discussion. There's a bunch of information that might not be shared because it's too private to share online. I certainly wouldn't share all the information about a romantic interaction on LW, but I might share enough to ask an interesting question.

I do consider this case an interesting question. I like it when people discuss abstract principles like rational decision-making via Bayes' theorem based on practical, real-life examples instead of only far-out thought experiments.

I'm arguing that it was the wrong move in this case, and hurt him and others.

If I'm understanding you right, you don't even know the individual in question. People drop out of PhD programs all the time. I don't think you can say whether or not they have good reasons for doing so without investigating the case on an individual basis.