Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: Algernoq 14 July 2014 12:11:15AM 1 point [-]

Glad to hear it!

For others considering a PhD: usually the best (funded) PhD program you got into is a good choice for you. But only do it if you enjoy research/learning for its own sake.

Comment author: jsteinhardt 14 July 2014 01:18:54AM 4 points [-]

Tangential, but:

usually the best (funded) PhD program you got into is a good choice for you. But only do it if you enjoy research/learning for its own sake.

I'm not sure I agree with this, except insofar as any top-tier or even second-tier program will pay for your graduate education, at least in engineering fields, and so if they do not then that is a major red flag. I would say that research fit with your advisor, caliber of peers, etc. is much more important.

Comment author: Viliam_Bur 13 July 2014 09:34:07PM *  26 points [-]

I feel like the more important question is: How specifically did LW manage to make this kind of impression on you? I mean, are we so bad at communicating our ideas? Because many things you wrote here seem to me like quite the opposite of LW. But there is a chance that we really are communicating things poorly, and somehow this is an impression people can get. So I am not really concerned about the things you wrote, but rather about the fact that someone could get this impression. Because...

Rationality doesn't guarantee correctness.

Which is why this site is called "Less Wrong" in the first place. (Instead of e.g. "Absolutely Correct".) In many places in the Sequences it is written that, unlike the hypothetical perfect Bayesian reasoner, humans are pretty lousy at processing available evidence, even when we try.

deciding what to do in the real world requires non-rational value judgments

Indeed, this is why a rational paperclip maximizer would create as many paperclips as possible. (The difference between irrational and rational paperclip maximizers is that the latter has a better model of the world, and thus probably succeeds in creating more paperclips on average.)

Many LWers seem to assume that being as rational as possible will solve all their life problems.

Let's rephrase it as "...will provide them a better chance at solving their life problems."

instead, a better choice is to find more real-world data about outcomes for different life paths, pick a path (quickly, given the time cost of reflecting), and get on with getting things done.

Not sure exactly what you suggest here. We should not waste time reflecting, but instead pick a path quickly, because time is important. But we should find data. Uhm... I think that finding and processing the data takes some time, so I am not sure whether you recommend doing it or not.

LW recruiting (hpmor, meetup locations near major universities) appears to target socially awkward intellectuals (incl. me) who are eager for new friends and a "high-status" organization to be part of, and who may not have many existing social ties locally.

You seem to suggest some sinister strategy is used here, but I am not sure what other approach you would recommend as less sinister. Math, science, philosophy... are topics mostly nerds care about. How should we run a debate about math, science and philosophy in a way that is less attractive to nerds, but attracts many extraverted, highly social non-intellectuals, and still produces meaningful results?

Because I think many LWers would actually not oppose trying that, if they believed such a thing was possible and they could organize it.

LW members who are conventionally successful (e.g. PhD students at top-10 universities) typically became so before learning about LW

This is not strong evidence against the usefulness of LW. If you imagine a parallel universe with an alternative LW that does increase the average success of its readers, then even in that parallel universe, most of the most impressive LW readers became that impressive before reading LW. It is much easier to attract a PhD student at a top university with a smart text than to attract a smart-but-not-so-awesome person and make them a PhD student at a top university during the next year or two.

For example, the reader may be the wrong age to become a PhD student during the time they read LW; they may be too young or too old. Or the reader may have made some serious mistakes in the past (e.g. choosing the wrong university) that even LW cannot help overcome in the limited time. Or the reader may be so far below the top level that even making them more impressive is not enough to get them a PhD at a top university.

the LW community may or may not ... encourage them ... to drop out of their PhD program, go to "training camps" for a few months ...

WTF?! Please provide evidence of LW encouraging PhD students at top-10 universities to drop out of their PhD program to go to LW "training camps" (which by the way don't take a few months -- EDIT: I was wrong, actually there was one).

Here is a real LW discussion with a PhD student; you can see what realistic LW advice would look like. Here is some general study advice. Here is a CFAR "training camp" for students, and it absolutely doesn't require anyone to drop out of school... hint: it takes two weeks in August.

In summary: the real LW does not resemble the picture you described, and is sometimes actually closer to the opposite of it.

Comment author: jsteinhardt 13 July 2014 11:41:45PM 8 points [-]

WTF?! Please provide an evidence of LW encouraging PhD students at top-10 universities to drop out of their PhD program to go to LW "training camps" (which by the way don't take a few months).

When I visited MIRI one of the first conversations I had with someone was them trying to convince me not to pursue a PhD. Although I don't know anything about the training camp part (well, I've certainly been repeatedly encouraged to go to a CFAR camp, but that is only a weekend and given that I teach for SPARC it seems like a legitimate request).

Comment author: jsteinhardt 13 July 2014 11:36:45PM *  10 points [-]

Hi Algernoq,

Thanks for writing this. This sentence particularly resonated:

LW members who are conventionally successful (e.g. PhD students at top-10 universities) typically became so before learning about LW, and the LW community may or may not support their continued success (e.g. may encourage them, with only genuine positive intent, to spend a lot of time studying Rationality instead of more specific skills).

I was definitely explicitly discouraged from pursuing a PhD by certain rationalists and I think listening to their advice would have been one of the biggest mistakes of my life. Unfortunately I see this attitude continuing to be propagated so I am glad that you are speaking out against it.

EDIT: Although, it looks like you've changed my favorite part! The text that I quoted above was not the original text (which talked more about dropping out of a PhD and starting a start-up).

Comment author: jsteinhardt 13 July 2014 11:34:09PM 7 points [-]

I mean, are we so bad at communicating our ideas?

I find this presumption (that the most likely cause for disagreement is that someone misunderstood you) to be somewhat abrasive, and certainly unproductive (sorry for picking on you in particular, my intent is to criticize a general attitude that I've seen across the rationalist community and this thread seems like an appropriate place). You should consider the possibility that Algernoq has a relatively good understanding of this community and that his criticisms are fundamentally valid or at least partially valid. Surely that is the stance that offers greater opportunity for learning, at the very least.

Comment author: jsteinhardt 13 July 2014 11:29:38PM *  8 points [-]

Rather, the problem is that at least one celebrated authority in the field hates that, and would prefer much, much more deference to authority.

I don't think this is true at all. His points against replicability are very valid and match my experience as a researcher. In particular:

Because experiments can be undermined by a vast number of practical mistakes, the likeliest explanation for any failed replication will always be that the replicator bungled something along the way.

This is a very real issue and I think that if we want to solve the current issues with science we need to be honest about this, rather than close our eyes and repeat the mantra that replication will solve everything. And it's not like he's arguing against accountability. Even in your quoted passage he says:

The field of social psychology can be improved, but not by the publication of negative findings. Experimenters should be encouraged to restrict their “degrees of freedom,” for example, by specifying designs in advance.

Now, I think he goes too far by saying that no negative findings should be published; but I think they need to be held to a high standard for the very reason he gives. On the other hand, positive findings should also be held to a higher standard.

Note that there are people much wiser than me (such as Andrew Gelman) who disagree with me; Gelman is dissatisfied with the current presumption that published research is correct. I certainly agree with this but for the same reasons that Mitchell gives, I don't think that merely publishing negative results can fix this issue.

Either way, I think you are being quite uncharitable to Mitchell.

Comment author: somervta 11 July 2014 02:28:49AM 14 points [-]

So, after reading the comments, I figure I should speak up because selection effects

I appreciated the deleting of the original post. I thought it was silly and pointless, and not what should be on LW. I didn't realize it was being upvoted (or I would have downvoted it), and I still don't know why it was.

I endorse the unintrusive (i.e., silent and unannounced) deleting of things like this (particularly given that the author was explicitly not taking the post seriously - written while drunk, etc), and I suspect others do as well.

There's a thing that happens wherein any disagreement with moderation ends up being much more noticeable than agreement. I wouldn't be surprised if there were many who, like me, agreed with decisions like this and weren't speaking up. If so, I urge you to briefly comment (even just "I agree/d with the decision to delete").

Comment author: jsteinhardt 11 July 2014 07:23:38AM 2 points [-]

Maybe a poll would be better?

Comment author: jsteinhardt 10 July 2014 12:44:55AM 0 points [-]

Old math and computing Olympiad problems are good for testing problem solving skills.

Comment author: Username 06 July 2014 06:06:23PM 2 points [-]

Then one day, a foreign missionary enters your country in order to start spreading his religion among the naïve and carefree people of your land. His religion states things that are not strictly illegal but which are incredibly harmful: you have to believe everything it says or you will be burned to death, you have to verbally abuse your children daily to raise them properly, you must reject from society anyone who holds ideas the religion disagrees with, science and critical thinking are wicked, and so on and so forth. This religion goes against everything you value and, what's worse, it seems to be catching on.

If the religion is so obviously harmful why is it catching on? To paraphrase Kaj, why is it the place of individual people to decide that this religion needs censorship?

Comment author: jsteinhardt 06 July 2014 06:48:24PM 5 points [-]

Farmville must be an excellent game because so many people play it.

Comment author: Kaj_Sotala 05 July 2014 11:42:22AM 9 points [-]

So with several of the other users that Eugine had hit, the difference between his downvote total and that of the second-highest downvoter was quite drastic: in one case, there were 26 times as many downvotes from Eugine as from the second highest downvoter.

The pattern is different in your case: the top ten downvote balances against your account are 150, 74, 55, 36, 32, 31, 28, 20, 19, 17. (Eugine doesn't appear to have hit you, as he isn't included in this list.) It's plausible that the 150 person is a mass downvoter, and also that the 74 person is, given that the 74 person also had a suspiciously high downvote count towards another person. But at the same time, it also looks like there were a lot of people downvoting your comments. If I assume that most of the users in this list were "legitimate" downvoters, then I'm unsure whether this data alone is sufficient to indicate exactly who the mass downvoter(s) was. The 150 person is the most likely culprit, but maybe it was several of the lower-ranking ones acting independently of each other, and the 150 one just happened to see a lot of your comments that he didn't naturally like? Whose downvotes should I have reversed, and whose should I let stand?

Then again, I don't know how large a fraction 150 comments is of your total comment history: if it's a high percentage, then it sounds more plausible that the person in question is indeed a mass-downvoter, since it would be much more unlikely for them to run into 150 of your comments that they just naturally disliked.

And now I have the feeling that the rational course of action would be to pick some percentage at which I'd act as if this was a confirmed mass-downvoter, before hearing the answer to "what percentage of your comments is this"... but I don't have a very good clue of where to set the burden of evidence in cases where the situation isn't blazingly obvious.
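Kaj's intuition here (that 150 organic downvotes from a single reader becomes implausible as it grows relative to the total comment count) can be sketched with a toy binomial model. The numbers below (1,000 total comments, a 5% chance that a legitimate reader dislikes any given comment) are purely hypothetical assumptions for illustration, not actual site data:

```python
import math

def binomial_tail(n, k_min, p):
    """P(X >= k_min) for X ~ Binomial(n, p), summed in log space
    so that tiny probabilities don't underflow to zero."""
    log_terms = [
        math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
        + k * math.log(p) + (n - k) * math.log1p(-p)
        for k in range(k_min, n + 1)
    ]
    m = max(log_terms)
    return math.exp(m) * sum(math.exp(t - m) for t in log_terms)

# Toy numbers: 1,000 comments, each independently disliked with probability 5%.
# How likely is it that one legitimate reader racks up 150+ organic downvotes?
p_organic = binomial_tail(1000, 150, 0.05)
print(f"P(>= 150 organic downvotes) = {p_organic:.2e}")  # astronomically small
```

Under these assumed numbers, 150 downvotes from one organic reader would be wildly improbable, which is the sense in which the "150 person" stands out; with a higher assumed dislike rate or a larger comment history the conclusion weakens, which is exactly why the "what percentage of your comments is this" question matters.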

Comment author: jsteinhardt 05 July 2014 08:14:28PM *  5 points [-]

Just based on brazil84's karma total, the 150 number seems unlikely to be more than 50% of brazil84's posts. It seems very much within the margin of statistical error that a number that high would show up, especially given the other users with large numbers of downvotes against brazil84. I think reversing the votes on this amount of evidence would be a pretty big stretch, fwiw (despite my being strongly in favor of the earlier ban as well as of reversing all of Eugine's votes).

Comment author: jsteinhardt 05 July 2014 09:56:36AM 0 points [-]

If you go to a relatively good high school, one of the worst aspects of it is the fact that most of the advanced classes (e.g. AP classes) spend most of their time "teaching to the exam" rather than focusing on providing knowledge. Doing the same for college would, in my opinion, completely ruin the university experience. Part of the point of college is to give students the freedom to explore their interests. Grading is really a very small part of the entire endeavor.

(I single out "relatively good high schools" because I imagine for most high schools the alternative to an AP class would be no class, so the AP class is probably an improvement. At a sufficiently good high school the alternative would be a college-level class taught by a local professor.)
