The Center for Applied Rationality is running two more four-day workshops: Jan 25-28 and March 1-4 in the San Francisco Bay Area. Like the previous workshop, these sessions are targeted at ambitious, analytic people who have broad intellectual interests and who care about making real-world projects work. Less Wrong veterans and newcomers alike are welcome: as discussed below, we are intentionally bringing together folks with varied backgrounds and skill bases.
The following excerpts are from “Does philosophy improve critical thinking skills?”, Ortiz 2007.
This thesis makes a first attempt to subject the assumption that studying [Anglo-American analytic] philosophy improves critical thinking skills to rigorous investigation.
…Thus the second task, in Chapter 3, is to articulate and critically examine the standard arguments that are raised in support of the assumption (or rather, would be raised if philosophers were in the habit of providing support for the assumption). These arguments are found to be too weak to establish the truth of the assumption. The failure of the standard arguments leaves open the question of whether the assumption is in fact true. The thesis argues at this point that, since the assumption is making an empirical assertion, it should be investigated using standard empirical techniques as developed in the social sciences. In Chapter 4, I conduct an informal review of the empirical literature. The review finds that evidence from the existing empirical literature is inconclusive. Chapter 5 presents the empirical core of the thesis. I use the technique of meta-analysis to integrate data from a large number of empirical studies. This meta-analysis gives us the best yet fix on the extent to which critical thinking skills improve over a semester of studying philosophy, general university study, and studying critical thinking. The meta-analysis results indicate that students do improve while studying philosophy, and apparently more so than general university students, though we cannot be very confident that this difference is not just the result of random variation. More importantly, studying philosophy is less effective than studying critical thinking, regardless of whether one is being taught in a philosophy department or in some other department. Finally, studying philosophy is much less effective than studying critical thinking using techniques known to be particularly effective such as LAMP.
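For readers unfamiliar with the technique: a meta-analysis pools effect sizes from many studies, weighting each by its precision, so that large, precise studies count more than small, noisy ones. Below is a minimal sketch of fixed-effect (inverse-variance) pooling in Python; the study names and numbers are purely illustrative and are not Ortiz's data.

```python
import math

# Illustrative (made-up) per-study effect sizes: standardized gain in
# critical-thinking scores over one semester, with standard errors.
studies = [
    {"name": "Study A (philosophy)", "effect": 0.26, "se": 0.10},
    {"name": "Study B (philosophy)", "effect": 0.31, "se": 0.14},
    {"name": "Study C (philosophy)", "effect": 0.18, "se": 0.09},
]

# Fixed-effect inverse-variance pooling: weight each study by 1 / SE^2.
weights = [1.0 / s["se"] ** 2 for s in studies]
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"Pooled effect: {pooled:.2f}")
print(f"95% CI: [{pooled - 1.96 * pooled_se:.2f}, {pooled + 1.96 * pooled_se:.2f}]")
```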
“I do not say this lightly... but if you're looking for superpowers, this is the place to start.”
--Michael Curzi, summer 2011 minicamp participant
Who: You and a class full of other aspiring rationalists and world-optimizers, from around the world.
What: Two 3-day weekend minicamps and one 8-day minicamp, filled with hands-on activities for applying rationality to your life, your goals, and the making of a better world. (See details in the FAQ.)
When and where: We're running three camps, so that we can do this for three sets of participants: May 11-13 and June 22-24 for the 3-day camps, and July 21-28 for the eight-day camp, all in the San Francisco Bay Area.
Why: Because you’re a social primate, and the best way to jump into a new way of thinking, make friends, and accomplish your goals is often to spend time with other primates who are doing just that.
- Hang out and explore the Bay Area with two dozen other people like you who are smart, interesting, and passionate about rationality
- Attend bonus sessions about style, body language, and confidence-building.
- Get help charting out career paths; and, entirely optionally for those interested, connect with folks at the Singularity Institute about optimal philanthropy.
Instructors: Eliezer Yudkowsky, Anna Salamon, Julia Galef, Andrew Critch, Luke Muehlhauser, and Michael Smith.
Cost: $650 for the three-day programs; $1500 for the week-long program. This includes lodging, meals, and tuition.
(Note that this *still* isn't quite enough to make running minicamps sustainable in the long run: lodging and meals at retreat centers start at around $90 per person per night, the "three-day camps" include four nights, and each workshop takes a staff of about five full-time people for over a month of preparation beforehand, most of us at $3k/month, counting curriculum development time, plus miscellaneous expenses. We are trying to strike a compromise between charging enough to run more camps and staying affordable, especially in our start-up phase; costs will probably go up in following years.)
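As a rough sanity check on that parenthetical, here is a back-of-the-envelope cost estimate for a single three-day camp. The participant count and exact prep time are assumptions for illustration, loosely matching the "two dozen other people" description above; they are not official figures.

```python
# Back-of-the-envelope per-participant cost for one "three-day" minicamp.
# Figures come from the parenthetical above, except the participant count
# and prep duration, which are assumed for illustration.
participants = 25            # assumed: "two dozen other people"
nights = 4                   # the "three-day camps" include four nights
lodging_per_night = 90       # lodging + meals, per person per night (low end)
staff = 5                    # about five full-time people per workshop
staff_monthly_pay = 3000     # most staff at ~$3k/month
prep_months = 1.25           # "over a month" of preparation, assumed ~5 weeks

lodging_cost = participants * nights * lodging_per_night
staff_cost = staff * staff_monthly_pay * prep_months
total = lodging_cost + staff_cost

print(f"Lodging + meals:      ${lodging_cost:,.0f}")
print(f"Staff time:           ${staff_cost:,.0f}")
print(f"Cost per participant: ${total / participants:,.0f}  (vs. $650 charged)")
```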
Three days (or a week) isn’t long enough to learn rationality, but it's long enough to learn how to learn rationality, and to get some momentum toward doing so.
Come meet us, and see what you can do.
Recently, Portland Less Wrong played a game that was a perfect trifecta: a difficult mental exercise, fun, and an opportunity to learn about biases and recognize them in yourself and others. We're still perfecting it, and we'd welcome feedback, especially from people who try it.
The Short Version
The game is a combination of Pandemic, a cognitively demanding cooperative board game, and the idea of roleplaying cognitive biases. In our favorite way of playing it (so far), everyone selects a bias at random and then attempts to exaggerate that bias in their arguments and decisions during the game. Everyone attempts to identify the biases of the other players, and, when a bias is guessed correctly, the guessed player selects a new bias and begins again.
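If you want to try it, here is a minimal sketch of the bias-dealing mechanic in Python; the bias list is just a sample and the function names are ours, not part of any settled rules.

```python
import random

# A sample pool of biases to roleplay; extend it as you like.
BIASES = [
    "sunk cost fallacy",
    "confirmation bias",
    "anchoring",
    "planning fallacy",
    "availability heuristic",
    "overconfidence",
]

def deal_biases(players):
    """Secretly assign each player a distinct random bias to exaggerate."""
    return dict(zip(players, random.sample(BIASES, len(players))))

def on_correct_guess(assignments, player):
    """When a player's bias is guessed, deal them a fresh one and play on."""
    unused = [b for b in BIASES if b not in assignments.values()]
    assignments[player] = random.choice(unused)
    return assignments

if __name__ == "__main__":
    table = deal_biases(["Alice", "Bob", "Carol", "Dave"])
    # In a real game each assignment would be revealed privately.
    print(table)
    print(on_correct_guess(table, "Bob"))
```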
As I've been reading through various articles and their comments on Less Wrong, I've noticed a theme that has appeared repeatedly: a frustration that we are not seeing more practical benefits from studying rationality. For example, Eliezer writes in A Sense that More Is Possible,
Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people...
Yvain writes in Extreme Rationality: It's Not That Great,
...I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines, I can't think of any.
patrissimo wrote in a comment on another article,
Sorry, folks, but compared to the self-help/self-development community, Less Wrong is currently UTTERLY LOSING at self-improvement and life optimization.
These writers have also offered some suggestions for improving the situation. Eliezer writes,
Of this [question] there are several answers; but one of them, surely, is that they have received less systematic training of rationality in a less systematic context than a first-dan black belt gets in hitting people.
patrissimo describes what he thinks an effective rationality practice would look like.
- It is a group of people who gather in person to train specific skills.
- While there are some theoreticians of the art, most people participate by learning it and doing it, not theorizing about it.
- Thus the main focus is on local practice groups, along with the global coordination to maximize their effectiveness (marketing, branding, integration of knowledge, common infrastructure). As a result, it is driven by the needs of the learners [emphasis added].
- You have to sweat, but the result is you get stronger.
- You improve by learning from those better than you, competing with those at your level, and teaching those below you.
- It is run by a professional, or at least someone getting paid [emphasis added] for their hobby. The practicants receive personal benefit from their practice, in particular from the value-added of the coach, enough to pay for talented coaches.
Dan Nuffer and I have decided that it's time to stop talking and start doing. We are in the very early stages of creating a business to help people improve their lives by training them in instrumental rationality. We've done some preliminary market research to get an idea of where the opportunities might lie. In fact, this venture got started when, on a whim, I ran a poll on ask500people.com asking,
Would you pay $75 for an interactive online course teaching effective decision-making skills?
I got 299 responses in total. Here are the proportions that responded "likely" or "very likely":
- 23.4% (70) overall.
- 49% (49 of 100) of the respondents from India.
- 10.6% (21 of 199) of the respondents not from India.
- 9.0% (8 of 89) of the respondents from the U.S.
These numbers were much higher than I expected, especially the numbers from India, which still puzzle me. Googling around a bit, though, I found an instructor-led online decision-making course for $130, and a one-day decision-making workshop offered in the UK for £200 (over $350)... and the Google keyword tool returns a large number of search terms (800) related to "decision-making", many of them with a high number of monthly searches.
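Since the subgroup samples are small, it is worth eyeballing how wide the uncertainty on those percentages is. Below is a minimal sketch using a normal-approximation 95% confidence interval for each proportion; the counts are the ones reported above.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# "Likely" or "very likely" responses, as reported above.
groups = {
    "overall":       (70, 299),
    "India":         (49, 100),
    "outside India": (21, 199),
    "U.S.":          (8, 89),
}

for name, (k, n) in groups.items():
    p, low, high = proportion_ci(k, n)
    print(f"{name:>13}: {p:5.1%}  (95% CI {low:5.1%} to {high:5.1%})")
```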
So it appears that there may be a market for training in effective decision-making -- something that could be the first step toward a more comprehensive training program in instrumental rationality. Some obvious market segments to consider are business decision-makers, small business owners, and intelligent people of an analytical bent (e.g., the kind of people who find Less Wrong interesting). An important subset of this last group is INTJ personality types; I don't know if there is an effective way to find and market to specific Myers-Briggs personality types, but I'm looking into it.
"Life coaching" is a proven business, and its growing popularity suggests the potential for a "decision coaching" service; in fact, helping people with big decisions is one of the things a life coach does. One life coach of 12 years described a typical client as age 35 to 55, who is "at a crossroads, must make a decision and is sick of choosing out of safety and fear." Life coaches working with individuals typically charge around $100 to $300 per hour. As far as I can tell, training in decision analysis / instrumental rationality is not commonly found among life coaches. Surely we can do better.
Can we do effective training online? patrissimo thinks that gathering in person is necessary, but I'm not so sure. His evidence is that "all the people who have replied to me so far saying they get useful rationality practice out of the LW community said the growth came through attending local meetups." To me this is weak evidence -- it seems to say more about the effectiveness of local meetups vs. just reading about rationality. In any event, it's worth testing whether online training can work, since
- not everyone can go to meetups,
- it should be easier to scale up, and
- not to put too fine a point on it, but online training is probably more profitable.
To conclude, one of the things an entrepreneur needs to do is "get out of the building" and talk to members of the target market. We're interested in hearing what you think. What ideas do you think would be most effective in training for instrumental rationality, and why? What would you personally want from a rationality training program? What kinds of products or services related to rationality training would you be interested in buying?
Recent brainstorming sessions at SIAI (with participants including Anna, Carl, Jasen, Divia, Will, Amy Willey, and Andrew Critch) have started to produce lists of rationality skills that we could potentially try to teach (at Rationality Boot Camp, at Less Wrong meetups, or similar venues). We've also been trying to break those skills down to the 5-second level (step 2) and come up with ideas for exercises that might teach them (step 3) although we haven't actually composed those exercises yet (step 4, where the actual work takes place).
The bulk of this post will go into the comments, which I'll try to keep to the following format: a top-level comment is a major or minor skill to teach; upvote it if you think this skill should get priority in teaching. Second-level comments describe 5-second subskills that go into this skill, and third-level comments are ideas for exercises which could potentially train that 5-second skill. If anyone actually does the work of composing a specific exercise people could run through, that would go at the fourth level of commenting, I guess. For some major practicable arts with a known standard learning format, like improv or acting, I'll put the exercise at the top and guesses at which skills it might teach below. (And any plain old replies can go at any level.)
I probably won't be able to get to all of what we brainstormed today, so here's a PNG of the Freemind map that I generated during our session.
This article aims to show that Ace Attorney is arguably the first rationalist game in the lesswrongian sense, or at least a remarkable proto-example; that it subliminally works to raise the sanity waterline in the general population; and that it might provide a template for future works that aim to achieve a similar effect.
The Ace Attorney series of games for the Nintendo DS console puts you in the shoes of Phoenix Wright, an attorney who, in the vein of Perry Mason, takes on difficult cases to defend his clients. The judicial system in the games is heavily inspired by that of Japan, and the odds are so stacked against the defense that it's practically a kangaroo court where your clients are guilty until proven innocent.
For those unfamiliar with the game, or who want to explore its "social criticism" aspect, I wholeheartedly recommend this most excellent article from The Escapist. Now that that's out of the way, we can move on to what makes this relevant for Less Wrong. What makes this game uniquely interesting from a rationalist POV is that its entire set of game mechanics is based on:
- gathering material evidence
- finding the factual contradictions in the witnesses' testimonies
- using the evidence to bust the lies open and force the truth out
What's damaging about moralizing that we wish to avoid, what useful purpose does moralizing usually serve, and what allows us to avoid the damage while retaining the usefulness? It engages psychological adaptations that promote conflict (by playing on social status), which are unpleasant to experience and can lead to undesirable consequences in the long run (such as feeling systematically uncomfortable interacting with a person, and so not being able to live or work or be friends with them). It serves the purpose of imprinting your values, which you feel to be right, on the people you interact with. Consequentialist elucidation of reasons for approving or disapproving of a given policy (virtue) is an effective persuasion technique if your values are actually right (for the people you try to confer them on), and it doesn't engage the same parts of your brain that make moralizing undesirable.
What happens here is transfer of responsibility for important tasks from the imperfect machinery that historically used to manage them (with systematic problems in any given context that humans but not evolution can notice), to explicit reasoning.
To develop methods of teaching rationality skills, you need to learn to focus on mental events that occur in 5 seconds or less. Most of what you want to teach is directly on this level; the rest consists of chaining together skills on this level.
As our first example, let's take the vital rationalist skill, "Be specific."
Even with people who've had moderate amounts of exposure to Less Wrong, a fair amount of my helping them think effectively often consists of my saying, "Can you give me a specific example of that?" or "Can you be more concrete?"
A couple of formative childhood readings that taught me to be specific:
"What is meant by the word red?"
"It's a color."
"What's a color?"
"Why, it's a quality things have."
"What's a quality?"
"Say, what are you trying to do, anyway?"
You have pushed him into the clouds. If, on the other hand, we habitually go down the abstraction ladder to lower levels of abstraction when we are asked the meaning of a word, we are less likely to get lost in verbal mazes; we will tend to "have our feet on the ground" and know what we are talking about. This habit displays itself in an answer such as this:
"What is meant by the word red?"
"Well, the next time you see some cars stopped at an intersection, look at the traffic light facing them. Also, you might go to the fire department and see how their trucks are painted."
-- S. I. Hayakawa, Language in Thought and Action
"Beware, demon!" he intoned hollowly. "I am not without defenses."
"Oh yeah? Name three."
-- Robert Asprin, Another Fine Myth
And now, no sooner does someone tell me that they want to "facilitate communications between managers and employees" than I say, "Can you give me a concrete example of how you would do that?" Hayakawa taught me to distinguish the concrete and the abstract; and from that small passage in Asprin, I picked up the dreadful personal habit of calling people's bluffs, often using the specific phrase, "Name three."
But the real subject of today's lesson is how to see skills like this on the 5-second level. And now that we have a specific example in hand, we can proceed to try to zoom in on the level of cognitive events that happen in 5 seconds or less.