
Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Dark Arts of Rationality

114 So8res 19 January 2014 02:47AM

Today, we're going to talk about Dark rationalist techniques: productivity tools which seem incoherent, mad, and downright irrational. These techniques include:

  1. Willful Inconsistency
  2. Intentional Compartmentalization
  3. Modifying Terminal Goals

I expect many of you are already up in arms. It seems obvious that consistency is a virtue, that compartmentalization is a flaw, and that one should never modify one's terminal goals.

I claim that these 'obvious' objections are incorrect, and that all three of these techniques can be instrumentally rational.

In this article, I'll promote the strategic cultivation of false beliefs and condone mindhacking on the values you hold most dear. Truly, these are Dark Arts. I aim to convince you that sometimes, the benefits are worth the price.

continue reading »

Lifestyle interventions to increase longevity

96 RomeoStevens 28 February 2014 06:28AM

There is a lot of bad science and controversy in the realm of how to have a healthy lifestyle. Every week we are bombarded with new studies conflicting older studies telling us X is good or Y is bad. Eventually we reach our psychological limit, throw up our hands, and give up. I used to do this a lot. I knew exercise was good, I knew flossing was good, and I wanted to eat better. But I never acted on any of that knowledge. I would feel guilty when I thought about this stuff and go back to what I was doing. Unsurprisingly, this didn't really cause me to make any positive lifestyle changes.

Instead of vaguely guilt-tripping you with potentially unreliable science news, this post aims to provide an overview of lifestyle interventions that have very strong evidence behind them and concrete ways to implement them.

continue reading »

Tell Culture

89 BrienneStrohl 18 January 2014 08:13PM

Followup to: Ask and Guess

Ask culture: "I'll be in town this weekend for a business trip. Is it cool if I crash at your place?" Response: “Yes” or “no”.

Guess culture: "Hey, great news! I'll be in town this weekend for a business trip!" Response: Infer that they might be telling you this because they want something from you, conclude that they might want a place to stay, and offer your hospitality only if you want to. Otherwise, pretend you didn’t infer that.

The two basic rules of Ask Culture: 1) Ask when you want something. 2) Interpret things as requests and feel free to say "no".

The two basic rules of Guess Culture: 1) Ask for things if, and *only* if, you're confident the person will say "yes". 2) Interpret requests as expectations of "yes", and, when possible, avoid saying "no".

Both approaches come with costs and benefits. In the end, I feel pretty strongly that Ask is superior. 

But these are not the only two possibilities!

"I'll be in town this weekend for a business trip. I would like to stay at your place, since it would save me the cost of a hotel, plus I would enjoy seeing you and expect we’d have some fun. I'm looking for other options, though, and would rather stay elsewhere than inconvenience you." Response: “I think I need some space this weekend. But I’d love to get a beer or something while you’re in town!” or “You should totally stay with me. I’m looking forward to it.”

There is a third alternative, and I think it's probably what rationalist communities ought to strive for. I call it "Tell Culture".

The two basic rules of Tell Culture: 1) Tell the other person what's going on in your own mind whenever you suspect you'd both benefit from them knowing. (Do NOT assume others will accurately model your mind without your help, or that it will even occur to them to ask you questions to eliminate their ignorance.) 2) Interpret things people tell you as attempts to create common knowledge for shared benefit, rather than as requests or as presumptions of compliance.

Suppose you’re in a conversation that you’re finding aversive, and you can’t figure out why. Your goal is to procure a rain check.

  • Guess: *You see this annoyed body language? Huh? Look at it! If you don’t stop talking soon I swear I’ll start tapping my foot.* (Or, possibly, tell a little lie to excuse yourself. “Oh, look at the time…”) 
  • Ask: “Can we talk about this another time?”
  • Tell: "I'm beginning to find this conversation aversive, and I'm not sure why. I propose we hold off until I've figured that out."

Here are more examples from my own life:

  • "I didn't sleep well last night and am feeling frazzled and irritable today. I apologize if I snap at you during this meeting. It isn’t personal." 
  • "I just realized this interaction will be far more productive if my brain has food. I think we should head toward the kitchen." 
  • "It would be awfully convenient networking for me to stick around for a bit after our meeting to talk with you and [the next person you're meeting with]. But on a scale of one to ten, it's only about 3 useful to me. If you'd rate the loss of utility for you as two or higher, then I have a strong preference for not sticking around." 

The burden of honesty is even greater in Tell culture than in Ask culture. To a Guess culture person, I imagine much of the above sounds passive aggressive or manipulative, much worse than the rude bluntness of mere Ask. It’s because Guess people aren’t expecting relentless truth-telling, which is exactly what’s necessary here.

If you’re occasionally dishonest and tell people you want things you don't actually care about--like their comfort or convenience--they’ll learn not to trust you, and the inherent freedom of the system will be lost. They’ll learn that you only pretend to care about them to take advantage of their reciprocity instincts, when in fact you’ll count them as having defected if they respond by stating a preference for protecting their own interests.

Tell culture is cooperation with open source codes.

This kind of trust does not develop overnight. Here is the most useful Tell tactic I know of for developing that trust with a native Ask or Guess. It’s saved me sooooo much time and trouble, and I wish I’d thought of it earlier.

"I'm not asking because I expect you to say ‘yes’. I'm asking because I'm having trouble imagining the inside of your head, and I want to understand better. You are completely free to say ‘no’, or to tell me what you’re thinking right now, and I promise it will be fine." It is amazing how often people quickly stop looking shifty and say 'no' after this, or better yet begin to discuss further details.

On saving the world

82 So8res 30 January 2014 08:00PM

This is the final post in my productivity sequence.

The first post described what I achieved. The next three posts describe how. This post describes why, explaining the sources of my passion and the circumstances that convinced a young Nate to try and save the world. Within, you will find no suggestions, no techniques to emulate, no new ideas to ponder. This is a rationalist coming-of-age story. With luck, you may find it inspiring. Regardless, I hope you can learn from my mistakes.

Never fear, I'll be back to business soon — there's lots of studying to do. But before then, there's a story to tell, a memorial to what I left behind.

I was raised Catholic. On my eighth birthday, having received my first communion about a year prior, I casually asked my priest how to reaffirm my faith and do something for the Lord. The memory is fuzzy, but I think I donated a chunk of allowance money and made a public confession at the following mass.

A bunch of the grownups made a big deal out of it, as grownups are wont to do. "Faith of a child", and all that. This confused me, especially when I realized that what I had done was rare. I wasn't trying to get pats on the head, I was appealing to the Lord of the Heavens and the Earth. Were we all on the same page, here? This was the creator. He was infinitely virtuous, and he had told us what to do.

And yet, everyone was content to recite hymns once a week and donate for the reconstruction of the church. What about the rest of the world, the sick, the dying? Where were the proselytizers, the missionary opportunities? Why was everyone just sitting around? 

On that day, I became acquainted with civilizational inadequacy. I realized you could hand a room full of people the literal word of God, and they'd still struggle to pay attention for an hour every weekend.

This didn't shake my faith, mind you. It didn't even occur to me that the grownups might not actually believe their tales. No, what I learned that day was that there are a lot of people who hold beliefs they aren't willing to act upon.

Eventually, my faith faded. The distrust remained.

continue reading »

2013 Survey Results

71 Yvain 19 January 2014 02:51AM

Thanks to everyone who took the 2013 Less Wrong Census/Survey. Extra thanks to Ozy, who helped me out with the data processing and statistics work, and to everyone who suggested questions.

This year's results are below. Some of them may make more sense in the context of the original survey questions, which can be seen here. Please do not try to take the survey, as it is over and your results will not be counted.

continue reading »

Strategic choice of identity

67 Vika 08 March 2014 04:27PM

Identity is mostly discussed on LW in a cautionary manner: keep your identity small, be aware of the identities you are attached to. As benlandautaylor points out, identities are very powerful, and while being rightfully cautious about them, we can also cultivate them deliberately to help us achieve our goals.

Some helpful identities that I have that seem generally applicable:

  • growth mindset
  • low-hanging fruit picker
  • truth-seeker
  • jack-of-all-trades (someone who is good at a variety of skills)
  • someone who tries new things
  • universal curiosity
  • mirror (someone who learns other people's skills)

Out of the above, the most useful is probably growth mindset, since it's effectively a meta-identity that allows the other parts of my identity to be fluid. The low-hanging fruit identity helps me be on the lookout for easy optimizations. The universal curiosity identity motivates me to try to understand various systems and fields of knowledge, besides the domains I'm already familiar with. It helps to give these playful or creative names, for example, "champion of low-hanging fruit". Some of these work well together, for example the "trying new things" identity contributes to the "jack of all trades" identity.

It's also important to identify unhelpful identities that get in your way. Negative identities can be vague like "lazy person" or specific like "someone who can't finish a project". With identities, just like with habits, the easiest way to reduce or eliminate a bad one seems to be to install a new one that is incompatible with it. For example, if you have a "shy person" identity, then going to parties or starting conversations with strangers can generate counterexamples for that identity, and help to displace it with a new one of "sociable person". Costly signaling can be used to achieve this - for example, joining a public speaking club. The old identity will not necessarily go away entirely, but the competing identity will create cognitive dissonance, which it can be useful to deliberately focus on. More specific identities require more specific counterexamples. Since the original negative identity makes it difficult to perform the actions that generate counterexamples, there needs to be some form of success spiral that starts with small steps.

Some examples of unhelpful identities I've had in the past were "person who doesn't waste things" and "person with poor intuition". The aversion to wasting money and material things predictably led to wasting time and attention instead. I found it useful to try "thinking like a trader" to counteract this "stingy person" identity, and get comfortable with the idea of trading money for time. Now I no longer obsess about recycling or buy the cheapest version of everything. Underconfidence in my intuition was likely responsible for my tendency to miss the forest for the trees when studying math or statistics, where I focused on details and missed the big picture ideas that are essential to actual understanding. My main objection to intuitions was that they feel imprecise, and I am trying to develop an identity of an "intuition wizard" who can manipulate concepts from a distance without zooming in. That is a cooler name than "someone who thinks about things without really understanding them", and brings to mind some people I know who have amazing intuition for math, which should help the identity stick.

There can also be ambiguously useful identities, for example I have a "tough person" identity, which motivates me to challenge myself and expand my comfort zone, but also increases self-criticism and self-neglect. Given the mixed effects, I'm not yet sure what to do about this one - maybe I can come up with an identity that only has the positive effects.

Which identities hold you back, and which ones propel you forward? If you managed to diminish negative identities, how did you do it and how far did you get?

Political Skills which Increase Income

56 Xodarap 02 March 2014 05:56PM

Summary: This article is intended for those who are "earning to give" (i.e. maximize income so that it can be donated to charity). It is basically an annotated bibliography of a few recent meta-analyses of predictors of income.

Key Results

  • The degree to which management “sponsors” your career development is an important predictor of your salary, as is how skilled you are politically.

  • Despite the stereotype of a silver-tongued salesman preying on people’s biases, rational appeals are generally the best tactic.

  • After rationality, the best tactics are types of ingratiation, including flattery and acting modest.

Ng et al. performed a metastudy of over 200 individual studies of objective and subjective career success. Here are the variables they found best correlated with salary:



  • Political Knowledge & Skills

  • Education Level

  • Cognitive Ability (as measured by standardized tests)

  • Training and Skill Development Opportunities

  • Hours Worked

  • Career Sponsorship


(all significant at p = .05)

(For reference, the “Big 5” personality traits all have a correlation under 0.12.)

Before we go on, a few caveats: while these correlations are significant and important, none are overwhelming (the authors cite Cohen as saying the range 0.24-0.36 is “medium” and correlations over 0.37 are “large”). Also, in addition to the usual correlation/causation concerns, there is lots of cross-correlation: e.g. older people might have greater political knowledge but less education, thereby confusing things. For a discussion of moderating variables, see the paper itself.
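To put those magnitudes in perspective, squaring a correlation gives the share of outcome variance it linearly explains. This is a standard interpretation, and the sketch below is my own illustration rather than anything from the paper:

```python
def variance_explained(r):
    """Share of outcome variance linearly explained by a predictor
    correlating r with it (i.e., r squared)."""
    return r * r

# Cohen's rough benchmarks, as cited above:
# |r| around 0.10 is "small", 0.24-0.36 "medium", 0.37+ "large".
# Even a "large" correlation of 0.37 explains only ~14% of variance:
print(variance_explained(0.37))   # ~0.137
# ...and the Big 5 traits, at r < 0.12, explain under 1.5%:
print(variance_explained(0.12))   # ~0.014
```

This is why even the strongest predictors in the table still leave most of the salary variation unexplained.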

Career Sponsorship

There are two broad models of career advancement: contest-mobility and sponsorship-mobility. They are best illustrated with an example.

Suppose Peter and Penelope are both equally talented entry-level employees. Under the contest-mobility model, they would both be equally likely to get a raise or promotion, because they are equally skilled.

Sponsorship-mobility theorists argue that even if Peter and Penelope are equally talented, it’s likely that one of them will catch the eye of senior management. Perhaps it’s due to one of them having an early success by chance, making a joke in a meeting, or simply having a more memorable name, like Penelope. This person will be singled out for additional training and job opportunities. Because of this, they’ll have greater success in the company, which will lead to more opportunities, etc. As a result, their initial small discrepancy in attention gets multiplied into a large differential.
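A toy simulation (my own sketch, not a model from the paper) shows how this compounding attention loop turns a tiny initial edge into a widening gap:

```python
def sponsorship_gap(initial_edge=0.05, compounding=0.10, periods=10):
    """Toy sponsorship-mobility model: each period, each employee's
    visibility grows in proportion to their current share of
    management attention, so a small head start compounds."""
    peter, penelope = 1.0, 1.0 + initial_edge
    for _ in range(periods):
        total = peter + penelope
        peter *= 1 + compounding * (peter / total)
        penelope *= 1 + compounding * (penelope / total)
    return penelope / peter  # > 1 means Penelope has pulled ahead
```

The ratio grows every period, and running the loop longer only widens the gap, which is the sponsorship-mobility claim in miniature.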

The authors of the metastudy found that self-reported sponsorship levels (i.e. how much you feel the management of your company “sponsors” you) have a significant, although moderate, relationship to salary. Therefore, the level at which you currently feel sponsored in your job should be a factor when you consider alternate opportunities.

The Dilbert Effect

The strongest predictor of salary (tied with education level) is what the authors politely term “Political Knowledge & Skills” - less politely, how good you are at manipulating others.

Several popular books (such as Cialdini’s Influence) on the subject of influencing others exist, and the study of these “influence tactics” in business stretches back 30 years to Kipnis, Schmidt and Wilkinson. Recently, Higgins et al. reviewed 23 individual studies of these tactics and how they relate to career success. Their results:



The tactics, with definitions from Higgins et al.:

  • Rationality: using data and information to make a logical argument supporting one's request

  • Ingratiation: using behaviors designed to increase the target's liking of oneself, or to make oneself appear friendly, in order to get what one wants

  • Upward appeal: relying on the chain of command, calling in superiors to help get one's way

  • Self-promotion: attempting to create an appearance of competence, or that you are capable of completing a task

  • Assertiveness: using a forceful manner to get what one wants

  • Exchange: making an explicit offer to do something for another in exchange for their doing what one wants

(Only ingratiation and rationality are significant.)

This site has a lot of information on how to make rational appeals, so I will focus on the less-talked-about ingratiation techniques.

How to be Ingratiating

Gordon analyzed 69 studies of ingratiation and found the following. (Unlike the previous two sections, success here is measured in lab tests as well as in career advancement. However, similar but less comprehensive results have been found in terms of career success):


The tactics, with weighted effectiveness measured as the Cohen’s d difference between control and intervention:

  • Other enhancement: flattery of the target

  • Opinion conformity: “go along to get along”

  • Self-presentation: any of the following tactics: self-promotion, self-deprecation, apologies, positive nonverbal displays, and name usage. Includes studies where the participants weren’t told which strategy to use, in addition to when they were instructed to use multiple strategies.

  • Rendering favors


Self-presentation is split further:


  • Apologizing for poor performance

  • Generic self-presentation: when the participant is told in generic terms to improve their self-presentation

  • Nonverbal behavior and name usage: nonverbal behavior includes things like wearing perfume; name usage means referring to people by name instead of a pronoun






One important moderator is the direction of the appeal. If you are talking to your boss, your tactics should be different than if you’re talking to a subordinate. Other-enhancement (flattery) is always the best tactic no matter who you’re talking to, but when talking to superiors it’s by far the best. When talking to those at similar levels to you, opinion conformity comes close to flattery, and the other techniques aren't far behind.

Unsurprisingly, when the target realizes you’re being ingratiating, the tactic is less effective. (Although effectiveness doesn’t go to zero - even when people realize you’re flattering them just to suck up, they generally still appreciate it.) Also, women are better at being ingratiating than men, and men are more influenced by these ingratiating tactics than women. The most important caveat is that lab studies find much larger effect sizes than in the field, to the extent that the average field effect for the ingratiating tactics is negative. This is probably due to the fact that lab experiments can be better controlled.


It’s unlikely that a silver-tongued receptionist will out-earn an introverted engineer. But simple techniques like flattery and attempting to get "sponsored" can appreciably improve returns, to the extent that political skills are one of the strongest predictors of salaries.


I would like to thank Brian Tomasik and Gina Stuessy for reading early drafts of this article.


Cohen, Jacob. Statistical power analysis for the behavioral sciences. Psychology Press, 1988.


Gordon, Randall A. "Impact of ingratiation on judgments and evaluations: A meta-analytic investigation." Journal of Personality and Social Psychology 71.1 (1996): 54.


Higgins, Chad A., Timothy A. Judge, and Gerald R. Ferris. "Influence tactics and work outcomes: a meta‐analysis." Journal of Organizational Behavior 24.1 (2003): 89-106.


Judge, Timothy A., and Robert D. Bretz Jr. "Political influence behavior and career success." Journal of Management 20.1 (1994): 43-65.


Kipnis, David, Stuart M. Schmidt, and Ian Wilkinson. "Intraorganizational influence tactics: Explorations in getting one's way." Journal of Applied Psychology 65.4 (1980): 440.

Ng, Thomas WH, et al. "Predictors of objective and subjective career success: A meta‐analysis." Personnel Psychology 58.2 (2005): 367-408.

Less Wrong Study Hall - Year 1 Retrospective

46 Error 18 March 2014 01:54AM

Some time back, a small group of Less Wrongers collected in a video chatroom to work on…things. We’ve been at it for exactly one year as of today, and it seems like a good time to see what’s come of it.[1] So here is what we’ve done, what we’re doing, and a few thoughts on where we’re going. At the end is a survey taken of the LWSH, partly to be compared to Less Wrong proper, but mostly for fun. If you like what you see here, come join us. The password is “lw”.

A Brief History of the Hall

I think the first inspiration was Eliezer looking for someone to sit with him while he worked, to help with productivity and akrasia. Shannon Friedman answered the call and it seemed to be effective. She suggested a similar coworking scheme to one of her clients, Mqrius, to help him with akratic issues surrounding his thesis. She posted on Less Wrong about it, with the intent of connecting him and possibly others who wanted to co-work in a similar fashion. Tsakinis, in the comments, took the idea a step further and created a Tinychat video chatroom for group working. It was titled the Less Wrong Study Hall. The theory was that it would help us actually do the work, instead of, say, reading tvtropes when we should be studying. It turned out to be a decent Schelling point, enough to form a regular group and occasionally attract new people. It’s grown slowly but steadily.

Tinychat’s software sucks, and there have been a couple of efforts to replace it. Mqrius looked into OpenMeetings, but it didn’t work out. Yours truly took a crack at programming a LWSH Google Hangout, but it ran aground on technical difficulties. Meanwhile the tinychat room continued to work, and despite nobody actually liking it, it’s done the job well enough.

Tinychat is publicly available, and there have been occasional issues with the public along the way. A few people took up modding, but it was still a nuisance. Eventually a password was placed on the room, which mostly shut down the problem. We did have one guy straight out guess the password, which was a…peculiar experience. He was notably not all there, but somehow still scrupulously polite, and left when asked. I don’t think I’ve ever seen that happen on the Internet before.

A year after the Hall opened, we have about twenty to twenty-five regulars, with an unknown number of occasional users. We’re still well within Dunbar’s number, so everybody knows everybody else and new users integrate quickly. We’ve developed a reasonably firm set of social norms to guide our work, in spite of not having direct technical control nor clear leaders.

continue reading »

Self-Congratulatory Rationalism

43 ChrisHallquist 01 March 2014 08:52AM

Quite a few people complain about the atheist/skeptic/rationalist communities being self-congratulatory. I used to dismiss this as a sign of people's unwillingness to admit that rejecting religion, or astrology, or whatever, was any more rational than accepting those things. Lately, though, I've started to worry.

Frankly, there seem to be a lot of people in the LessWrong community who imagine themselves to be, not just more rational than average, but paragons of rationality whom other people should accept as such. I've encountered people talking as if it's ridiculous to suggest they might sometimes respond badly to being told the truth about certain subjects. I've encountered people asserting the rational superiority of themselves and others in the community for flimsy reasons, or no reason at all.

Yet the readiness of members of the LessWrong community to disagree with and criticize each other suggests we don't actually think all that highly of each other's rationality. The fact that members of the LessWrong community tend to be smart is no guarantee that they will be rational. And we have much reason to fear "rationality" degenerating into signaling games.

continue reading »

A Fervent Defense of Frequentist Statistics

43 jsteinhardt 18 February 2014 08:08PM

[Highlights for the busy: de-bunking standard "Bayes is optimal" arguments; frequentist Solomonoff induction; and a description of the online learning framework. Note: cross-posted from my blog.]

Short summary. This essay makes many points, each of which I think is worth reading, but if you are only going to understand one point I think it should be “Myth 5” below, which describes the online learning framework as a response to the claim that frequentist methods need to make strong modeling assumptions. Among other things, online learning allows me to perform the following remarkable feat: if I’m betting on horses, and I get to place bets after watching other people bet but before seeing which horse wins the race, then I can guarantee that after a relatively small number of races, I will do almost as well overall as the best other person, even if the number of other people is very large (say, 1 billion), and their performance is correlated in complicated ways.
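The feat described above is the standard “prediction with expert advice” guarantee. As an illustrative sketch (my own, not code from the essay), the classic Hedge / multiplicative-weights algorithm achieves it: treat each other bettor as an “expert”, bet proportionally to current weights, and exponentially downweight experts that lose:

```python
import math

def hedge(losses, eta=0.5):
    """Hedge / multiplicative-weights sketch. `losses` is a list of
    rounds; each round is a list of per-expert losses in [0, 1].
    Returns the algorithm's total expected loss, which is guaranteed
    to be close to the best single expert's total loss."""
    n = len(losses[0])
    weights = [1.0] * n
    total = 0.0
    for round_losses in losses:
        z = sum(weights)
        # bet proportionally to current weights
        total += sum((w / z) * l for w, l in zip(weights, round_losses))
        # exponentially downweight experts that did badly this round
        weights = [w * math.exp(-eta * l)
                   for w, l in zip(weights, round_losses)]
    return total
```

With losses in [0, 1], total regret after T rounds is at most ln(N)/η + ηT/8, and tuning η gives regret on the order of sqrt(T log N). That is why even a billion correlated competitors cost almost nothing: log(10⁹) is only about 21.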

If you’re only going to understand two points, then also read about the frequentist version of Solomonoff induction, which is described in “Myth 6”.

Main article. I’ve already written one essay on Bayesian vs. frequentist statistics. In that essay, I argued for a balanced, pragmatic approach in which we think of the two families of methods as a collection of tools to be used as appropriate. Since I’m currently feeling contrarian, this essay will be far less balanced and will argue explicitly against Bayesian methods and in favor of frequentist methods. I hope this will be forgiven as so much other writing goes in the opposite direction of unabashedly defending Bayes. I should note that this essay is partially inspired by some of Cosma Shalizi’s blog posts, such as this one.

This essay will start by listing a series of myths, then debunk them one by one. My main motivation for this is that Bayesian approaches seem to be highly popularized, to the point that one may get the impression that they are the uncontroversially superior method of doing statistics. I actually think the opposite is true: I think most statisticians would for the most part defend frequentist methods, although there are also many departments that are decidedly Bayesian (e.g. many places in England, as well as some U.S. universities like Columbia). I have a lot of respect for many of the people at these universities, such as Andrew Gelman and Philip Dawid, but I worry that many of the other proponents of Bayes (most of them non-statisticians) tend to oversell Bayesian methods or undersell alternative methodologies.

If you are like me from, say, two years ago, you are firmly convinced that Bayesian methods are superior and that you have knockdown arguments in favor of this. If this is the case, then I hope this essay will give you an experience that I myself found life-altering: the experience of having a way of thinking that seemed unquestionably true slowly dissolve into just one of many imperfect models of reality. This experience helped me gain more explicit appreciation for the skill of viewing the world from many different angles, and of distinguishing between a very successful paradigm and reality.

continue reading »
