Less Wrong is a community blog devoted to refining the art of human rationality.
About three and a half years ago, orthonormal ran an akrasia tactics review: an open-ended survey asking Less Wrong posters to give numerical scores to productivity techniques that they'd tried, with the goal of getting a more objective picture of how well different techniques work (for the sort of people who post here). Since it's been years since the original and Less Wrong has grown significantly while retaining akrasia as a major topic, I thought it'd be useful to have a new one!
A modified version of the instructions from the previous post:
- Note what technique you've tried. Techniques can be anything from productivity systems (Getting Things Done) to social incentives (precommitting in front of friends) to websites or computer programs (Beeminder, Leechblock) to chemical aids (Modafinil). If it's something that you can easily link to information about, please provide a link and I'll add it when I list the technique; if you don't have a link, describe it in your comment and I'll link that.
- Give your experience with it a score from -10 to +10 (0 if it didn't change the status quo, 10 if it ended your akrasia problems forever with no side effects, negative scores if it actually made your life worse, -10 if it nearly killed you). For simplicity's sake, I'll only include reviews that give numerical scores.
- Describe your experience with it, including any significant side effects. Please also say approximately how long you've been using it, or if you don't use it anymore how long you used it before giving up.
Every so often, I'll combine all the data back into the main post, listing every technique that's been reviewed at least twice with the number of reviews, average score, standard deviation and common effects, as well as links to the relevant reviews <edit: mostly canceling the last two parts because I think it'd be too much work for me for too little benefit for the reader>. I'll do my best to combine similar techniques appropriately, but it'd be appreciated if you could help organize things a bit by replying to people doing similar things and/or noting whether you feel your technique is (dis)similar to another.
I'm not going to provide an initial list due to the massive number of possible techniques and fear of prejudicing answers, but you can look back on the list in the last post if you want. If you have any suggestions for how to organize this (that wouldn't require huge amounts of extra effort on my part), I'm open to hearing them.
Thanks for your data!
Updated through 7/23/13. Organizing these turned out to be a lot harder than I expected and involved a lot of subjective categorization, so consult the primary sources.
Beeminder: +5.3 (SD 1.8). Details of how it's used vary a lot.
Getting Things Done (GTD): +2.8 (SD 4.0). A very broad and modular system, with opinions differing on different parts.
Remember The Milk: +5.5 (SD 3.0). Frequently mentioned in conjunction with GTD.
Pomodoros: +4.5 (SD 2.5).
Scheduling: +4.7 (SD 3.7).
Leechblock: +3.0 (SD 0.8).
Social precommitment: +0.7 (SD 2.6).
Unaided self-reinforcement: +0.7 (SD 0.9).
Trello: +5.0 (SD 3.0).
HabitRPG: +4.5 (SD 0.5).
LW Study Hall: +4.0 (SD 3.0).
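For anyone curious how a tally like the one above gets produced, here is a minimal sketch of the aggregation described earlier (mean and standard deviation per technique, skipping anything with fewer than two reviews). The technique names and scores below are hypothetical placeholders, not the actual survey data:

```python
from statistics import mean, pstdev

# Hypothetical review scores per technique (NOT the actual survey data)
reviews = {
    "Technique A": [4, 6, 7, 5],
    "Technique B": [3, 2, 4],
    "Technique C": [8],  # only one review: excluded below
}

# List every technique reviewed at least twice, with mean and SD
for technique, scores in sorted(reviews.items()):
    if len(scores) < 2:
        continue
    print(f"{technique}: {mean(scores):+.1f} "
          f"(SD {pstdev(scores):.1f}, n={len(scores)})")
```

I used the population standard deviation (`pstdev`) over the reviews actually received; the sample SD (`stdev`) would be a defensible alternative choice.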
The Center for Applied Rationality is running two more four-day workshops: Jan 25-28 and March 1-4 in the SF bay area. Like the previous workshop, these sessions are targeted at ambitious, analytic people who have broad intellectual interests, and who care about making real-world projects work. Less Wrong veterans and Less Wrong newcomers alike are welcome: as discussed below, we are intentionally bringing together folks with varied backgrounds and skill bases.
CFAR is taking LW-style rationality into the world, this month, with a new kind of rationality camp: Rationality for Entrepreneurs. It is aimed at ambitious, relatively successful folk (regardless of whether they are familiar with LW), who like analytic thinking and care about making practical real-world projects work. Some will be paying for themselves; others will be covered by their companies.
If you'd like to learn rationality in a more practical context, consider applying. Also, if you were hoping to introduce rationality and related ideas to a friend/acquaintance who fits the bill, please talk to them about the workshop, both for their sake and to strengthen the rationality community.
The price will be out of reach for some: the workshop costs $3.9k. But there is a money-back guarantee. Some partial scholarships may be available. This fee buys participants:
- Four nights and three days at a retreat center, with small classes, interactive exercises, and much opportunity for unstructured conversation that applies the material at meals and during the evenings (room and board is included);
- One instructor for every three participants;
- Six weeks of Skype/phone and email follow-up, to help participants make the material into regular habits, and navigate real-life business and personal situations with these tools.
CFAR is planning future camps which are more directly targeted at a Less Wrong audience (like our previous camps), so don’t worry if this camp doesn’t seem like the right fit for you (because of cost, interests, etc.). There will be others. But if you or someone you know does have an entrepreneurial bent, then we strongly recommend applying to this camp rather than waiting. Attendees will be surrounded by other ambitious, successful, practically-minded folks, learn from materials that have been tailored to entrepreneurial issues, and receive extensive follow-up to help apply what they’ve learned to their businesses and personal lives.
Our schedule is below.
(See also the thread about the camp on Hacker News.)
The following excerpts are from “Does philosophy improve critical thinking skills?”, Ortiz 2007.
This thesis makes a first attempt to subject the assumption that studying [Anglo-American analytic] philosophy improves critical thinking skills to rigorous investigation.
…Thus the second task, in Chapter 3, is to articulate and critically examine the standard arguments that are raised in support of the assumption (or rather, would be raised if philosophers were in the habit of providing support for the assumption). These arguments are found to be too weak to establish the truth of the assumption. The failure of the standard arguments leaves open the question of whether the assumption is in fact true. The thesis argues at this point that, since the assumption is making an empirical assertion, it should be investigated using standard empirical techniques as developed in the social sciences. In Chapter 4, I conduct an informal review of the empirical literature. The review finds that evidence from the existing empirical literature is inconclusive. Chapter 5 presents the empirical core of the thesis. I use the technique of meta-analysis to integrate data from a large number of empirical studies. This meta-analysis gives us the best yet fix on the extent to which critical thinking skills improve over a semester of studying philosophy, general university study, and studying critical thinking. The meta-analysis results indicate that students do improve while studying philosophy, and apparently more so than general university students, though we cannot be very confident that this difference is not just the result of random variation. More importantly, studying philosophy is less effective than studying critical thinking, regardless of whether one is being taught in a philosophy department or in some other department. Finally, studying philosophy is much less effective than studying critical thinking using techniques known to be particularly effective such as LAMP.
Simon is writing a calculus textbook. Since there are a lot of textbooks on the market, he wants to make his distinctive by including a lot of original examples. To do this, he decides to first check what sorts of examples are in some of the other books, and then make sure to avoid those. Unfortunately, after skimming through several other books, he finds himself completely unable to think of original examples—his mind keeps returning to the examples he's just read instead of coming up with new ones.
What he's experiencing here is another aspect of priming or anchoring. The way it appears to happen in my brain is that it anchors on the examples it's already seen and explores the idea-space from there, moving from an idea only to ideas closely related to it (similar to a depth-first search).
At first, this search strategy might not seem so bad—in fact, it's ideal if there is one best solution and the closer you get to it the better. For example, if you were shooting arrows at a target, all you'd need to consider is how close to the center you can hit. Where we run into problems, however, is trying to come up with multiple solutions (such as multiple examples of the applications of calculus), or trying to come up with the best solution when there are many plausible solutions. In these cases, our brain's default search algorithm will often grab the first idea it can think of and try to refine it, even if what we really need is a completely different idea.
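The contrast can be made concrete with a toy hill-climbing sketch: refining only nearby ideas finds the peak you were anchored to, while restarting from unrelated points can find a better one. The "idea quality" landscape and every number here are invented purely for illustration:

```python
import random

def quality(x):
    # Invented "idea quality" landscape: a mediocre peak near x=2
    # (the anchored example) and a better peak near x=8
    return max(0.0, 5 - (x - 2) ** 2) + max(0.0, 9 - (x - 8) ** 2)

def refine(start, step=0.1, iters=500):
    """Move only to closely related ideas, like the anchored search."""
    x = start
    for _ in range(iters):
        best = max((x - step, x, x + step), key=quality)
        if best == x:
            break  # stuck on a local peak
        x = best
    return x

anchored = refine(2.0)  # stays on the primed example's peak
random.seed(0)
restarts = [refine(random.uniform(0, 10)) for _ in range(10)]
diverse = max(restarts, key=quality)
print(quality(anchored), quality(diverse))  # local peak < global peak
```

The anchored search is "ideal if there is one best solution and the closer you get to it the better"; the random restarts are the analogue of deliberately starting from an unrelated idea.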
“I do not say this lightly... but if you're looking for superpowers, this is the place to start.”
--Michael Curzi, summer 2011 minicamp participant
Who: You and a class full of other aspiring rationalists and world-optimizers, from around the world.
What: Two 3-day weekend minicamps and one 8-day minicamp, filled with hands-on activities for applying rationality to your life, your goals, and the making of a better world. (See details in the FAQ.)
When and where: We're running three camps, so that we can do this for three sets of participants: May 11-13 and June 22-24 for the 3-day camps, and July 21-28 for the eight-day camp, all in the San Francisco Bay Area.
Why: Because you’re a social primate, and the best way to jump into a new way of thinking, make friends, and accomplish your goals is often to spend time with other primates who are doing just that.
- Hang out and explore the Bay Area with two dozen other people like you who are smart, interesting, and passionate about rationality.
- Attend bonus sessions about style, body language, and confidence-building.
- Get help charting out career paths; and, entirely optionally for those interested, connect with folks at the Singularity Institute about optimal philanthropy.
Instructors: Eliezer Yudkowsky, Anna Salamon, Julia Galef, Andrew Critch, Luke Muehlhauser, and Michael Smith.
Cost: $650 for the three-day programs; $1500 for the week-long program. This includes lodging, meals, and tuition.
(Note that this *still* isn't quite enough to make running minicamps sustainable in the long run: lodging + meals at retreat centers start at around $90 per person per night, the "three-day camps" include four nights, and each workshop takes a staff of about 5 full-time people (most of us at $3k/month) for over a month beforehand, counting curriculum development time, plus miscellaneous expenses. We are trying to strike a compromise between charging enough to run more camps and staying affordable, especially in our start-up phase; costs will probably go up in following years.)
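As a back-of-the-envelope check on that sustainability claim, here's the rough per-participant arithmetic using the approximate figures quoted above, plus the "two dozen" participants mentioned earlier; every number is an approximation taken from this post:

```python
# Rough per-participant cost for a three-day minicamp, using the
# approximate figures quoted in the post
participants = 24        # "two dozen other people"
nights = 4               # the "three-day camps" include four nights
lodging_per_night = 90   # room and board, per person per night
staff = 5
staff_monthly = 3000     # "most of us at $3k/month"
prep_months = 1          # "over a month" of preparation per workshop

lodging = nights * lodging_per_night                           # $360
staffing = staff * staff_monthly * prep_months / participants  # $625
print(f"~${lodging + staffing:.0f}/participant vs. the $650 fee")
```

On these rough numbers, costs come out noticeably above the three-day fee even before miscellaneous expenses, which is consistent with the point being made.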
Three days (or a week) isn’t long enough to learn rationality, but it's long enough to learn how to learn rationality, and to get some momentum toward doing so.
Come meet us, and see what you can do.
A couple of weeks ago, I was suffering from insomnia. Eventually my inability to fall asleep turned into frustration, which then led to feelings of self-doubt about my life in general. Soon I was wondering about whether I would ever amount to anything, whether any of my various projects would ever end up bearing fruit, and so forth. As usual, I quickly became convinced that my life prospects were dim, and that I should stop being ambitious and settle for some boring but safe path while I still had the chance.
Then I realized that there was no reason for me to believe in this, and I stopped thinking that way. I still felt frustrated about not being able to sleep, but I didn't feel miserable about my chances in life. To do otherwise would have been to misinterpret my emotions.
Let me explain what I mean by that. There are two common stereotypes about the role of emotions. The first says that emotions are something irrational, and should be completely disregarded when making decisions. The second says that emotions are basically always right, and one should follow their emotions above all. Psychological research on emotions suggests that the correct answer lies in between: we have emotions for a reason, and we should follow their advice, but not unthinkingly.
The Information Principle says that emotional feelings provide conscious information from unconscious appraisals of situations.1 Your brain is constantly appraising the situation you happen to be in. It notes things like a passerby having slightly threatening body language, or a conversation with some person being easy and free of misunderstandings. There are countless such evaluations going on all the time, and you aren't consciously aware of them because you don't need to be. Your subconscious mind can handle them just fine on its own. The end result of all those evaluations is packaged into a brief summary, which is the only thing that your conscious mind sees directly. That "executive summary" is what you experience as a particular emotional state. The passerby makes you feel slightly nervous and you avoid her, or your conversational partner feels pleasant to talk with and you begin to like him, even though you don't know why.
To some extent, then, your emotions will guide you to act appropriately in various situations, even when you don't know why you feel the way you do. However, it's important to interpret them correctly. Maybe you meet a new person on a good day and feel good when talking with them. Do you feel good because the person is pleasant to be with, or because the weather is pleasant? In general, emotions are only used as a source of information when their informational value is not called into question.2 If you know that you are sad because of something that happened in the morning, and still feel sad when talking to your friend later on, you don't assume that something about your friend is making you feel sad.
People also pay more attention to their feelings when they think them relevant for the question at hand. For example, moods have a larger impact when people are making decisions for themselves rather than for others, who may experience things differently. But by default, people tend to assume that their feelings and emotions are "about" whatever it is that they're thinking about at that moment. If they're not given a reason to presume that their emotions are caused by something other than the issue at hand, they don't.2
As I've been reading through various articles and their comments on Less Wrong, I've noticed a theme that has appeared repeatedly: a frustration that we are not seeing more practical benefits from studying rationality. For example, Eliezer writes in A Sense that More Is Possible,
Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people...
Yvain writes in Extreme Rationality: It's Not That Great,
...I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines, I can't think of any.
patrissimo wrote in a comment on another article,
Sorry, folks, but compared to the self-help/self-development community, Less Wrong is currently UTTERLY LOSING at self-improvement and life optimization.
These writers have also offered some suggestions for improving the situation. Eliezer writes,
Of this [question] there are several answers; but one of them, surely, is that they have received less systematic training of rationality in a less systematic context than a first-dan black belt gets in hitting people.
patrissimo describes what he thinks an effective rationality practice would look like.
- It is a group of people who gather in person to train specific skills.
- While there are some theoreticians of the art, most people participate by learning it and doing it, not theorizing about it.
- Thus the main focus is on local practice groups, along with the global coordination to maximize their effectiveness (marketing, branding, integration of knowledge, common infrastructure). As a result, it is driven by the needs of the learners [emphasis added].
- You have to sweat, but the result is you get stronger.
- You improve by learning from those better than you, competing with those at your level, and teaching those below you.
- It is run by a professional, or at least someone getting paid [emphasis added] for their hobby. The practicants receive personal benefit from their practice, in particular from the value-added of the coach, enough to pay for talented coaches.
Dan Nuffer and I have decided that it's time to stop talking and start doing. We are in the very early stages of creating a business to help people improve their lives by training them in instrumental rationality. We've done some preliminary market research to get an idea of where the opportunities might lie. In fact, this venture got started when, on a whim, I ran a poll on ask500people.com asking,
Would you pay $75 for an interactive online course teaching effective decision-making skills?
I got 299 responses in total. These are the numbers that responded with "likely" or "very likely":
- 23.4% (70) overall.
- 49% (49 of 100) of the respondents from India.
- 10.6% (21 of 199) of the respondents not from India.
- 9.0% (8 of 89) of the respondents from the U.S.
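The percentages follow directly from the counts, and the India / not-India subgroups partition the full sample of 299, so the overall figure can be checked by arithmetic:

```python
# "Likely"/"very likely" responses by subgroup, from the poll
subgroups = {
    "India": (49, 100),
    "not India": (21, 199),
    "U.S.": (8, 89),  # a subset of the not-India group
}

for name, (likely, total) in subgroups.items():
    print(f"{name}: {100 * likely / total:.1f}%")
# India: 49.0%, not India: 10.6%, U.S.: 9.0%

# India and not-India together cover all 299 respondents
overall = (49 + 21) / (100 + 199)
print(f"overall: {overall:.1%}")  # 23.4%
```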
These numbers were much higher than I expected, especially the numbers from India, which still puzzle me. Googling around a bit, though, I found an instructor-led online decision-making course for $130, and a one-day decision-making workshop offered in the UK for £200 (over $350)... and the Google keyword tool returns a large number of search terms (800) related to "decision-making", many of them with a high number of monthly searches.
So it appears that there may be a market for training in effective decision-making -- something that could be the first step towards a more comprehensive training program in instrumental rationality. Some obvious market segments to consider are business decision makers, small business owners, and intelligent people of an analytical bent (e.g., the kind of people who find Less Wrong interesting). An important subset of this last group is INTJ personality types; I don't know if there is an effective way to find and market to specific Myers-Briggs personality types, but I'm looking into it.
"Life coaching" is a proven business, and its growing popularity suggests the potential for a "decision coaching" service; in fact, helping people with big decisions is one of the things a life coach does. One life coach of 12 years described a typical client as age 35 to 55, who is "at a crossroads, must make a decision and is sick of choosing out of safety and fear." Life coaches working with individuals typically charge around $100 to $300 per hour. As far as I can tell, training in decision analysis / instrumental rationality is not commonly found among life coaches. Surely we can do better.
Can we do effective training online? patrissimo thinks that gathering in person is necessary, but I'm not so sure. His evidence is that "all the people who have replied to me so far saying they get useful rationality practice out of the LW community said the growth came through attending local meetups." To me this is weak evidence -- it seems to say more about the effectiveness of local meetups vs. just reading about rationality. In any event, it's worth testing whether online training can work, since
- not everyone can go to meetups,
- it should be easier to scale up, and
- not to put too fine a point on it, but online training is probably more profitable.
To conclude, one of the things an entrepreneur needs to do is "get out of the building" and talk to members of the target market. We're interested in hearing what you think. What ideas do you think would be most effective in training for instrumental rationality, and why? What would you personally want from a rationality training program? What kinds of products / services related to rationality training would you be interested in buying?
Summary: medical progress has been much slower than even recently predicted.
In the February and March 1988 issues of Cryonics, Mike Darwin (Wikipedia/LessWrong) and Steve Harris published a two-part article “The Future of Medicine” attempting to forecast the medical state of the art for 2008. Darwin has republished it on the New_Cryonet email list.
Darwin is a pretty savvy forecaster (you may remember him correctly predicting ALCOR’s recent troubles with grandfathering back in 1981, in “The High Cost of Cryonics”/part 2), so given my standing interest in tracking predictions, I read it with great interest; but they still blew most of their predictions, and not the ones we would have preferred them to.
The full essay is ~10k words, so I will excerpt roughly half of it below; feel free to skip to the reactions section and other links.