Cat, who has volunteered extensively at CFAR (and taught at CFAR), will be visiting many cities in Europe over the coming months.
She is awesome.
Also, the list of cities that she is visiting will probably be determined in the next few days.
If you'd like to have her visit your LW meetup group, share some of our classes with your meetup, and generally bring connections back and forth... comment below, or PM her (or me)! I suspect this can be a lot of fun, and useful as well. Offers of couch space and similar are also appreciated.
For now this is probably just for Europe, but there's no harm in touching base from other cities as well; it's possible she'll visit elsewhere later.
CFAR is hiring an additional logistics manager. Please click on our form for more information, or to fill out an application:
We hope to choose a candidate within the next week or so, so if you're interested, do apply ASAP.
The Center for Applied Rationality is running two more four-day workshops: Jan 25-28 and March 1-4 in the SF bay area. Like the previous workshop, these sessions are targeted at ambitious, analytic people who have broad intellectual interests, and who care about making real-world projects work. Less Wrong veterans and Less Wrong newcomers alike are welcome: as discussed below, we are intentionally bringing together folks with varied backgrounds and skill bases.
CFAR is taking LW-style rationality into the world, this month, with a new kind of rationality camp: Rationality for Entrepreneurs. It is aimed at ambitious, relatively successful folk (regardless of whether they are familiar with LW), who like analytic thinking and care about making practical real-world projects work. Some will be paying for themselves; others will be covered by their companies.
If you'd like to learn rationality in a more practical context, consider applying. Also, if you were hoping to introduce rationality and related ideas to a friend/acquaintance who fits the bill, please talk to them about the workshop, both for their sake and to strengthen the rationality community.
The price will be out of reach for some: the workshop costs $3.9k. But there is a money-back guarantee. Some partial scholarships may be available. This fee buys participants:
- Four nights and three days at a retreat center, with small classes, interactive exercises, and ample opportunity for unstructured conversation applying the material over meals and in the evenings (room and board included);
- One instructor for every three participants;
- Six weeks of Skype/phone and email follow-up, to help participants make the material into regular habits, and navigate real-life business and personal situations with these tools.
CFAR is planning future camps which are more directly targeted at a Less Wrong audience (like our previous camps), so don’t worry if this camp doesn’t seem like the right fit for you (because of cost, interests, etc.). There will be others. But if you or someone you know does have an entrepreneurial bent, then we strongly recommend applying to this camp rather than waiting. Attendees will be surrounded by other ambitious, successful, practically-minded folks, learn from materials that have been tailored to entrepreneurial issues, and receive extensive follow-up to help apply what they’ve learned to their businesses and personal lives.
Our schedule is below.
(See also the thread about the camp on Hacker News.)
Are there any Singaporean LW-ers out there? I'll be visiting Singapore for a few days with my husband, Carl Shulman, and we'd be keen to either have a short meet-up in a coffee shop somewhere, or to see Singaporean sights while talking to a LWer or two. Please comment or PM me if you're interested. We get in at noon this Thursday (tomorrow), and leave the morning of Sunday, the 26th.
Center for Modern Rationality currently hiring: executive assistants, teachers, research assistants, and consultants.
We are still looking for:
A second executive assistant -- preferably someone who lives in the SF bay area or is willing to relocate here, but remote work will also be considered. Apply here.
Teachers / curriculum designers. This *does* need to be someone who can relocate to the SF bay area, and who has the legal ability to work in the US. Apply here. Especially apply if:
- Rationality, or similar changes in your skill set, have made a big difference in your life;
- You enjoy teaching, and helping others change their lives; you have strong interpersonal skills;
- You have exceptional analytic skills, and want to help us figure out what sort of "rationality" and "rationality training" can actually work -- by being skeptical, trying things out, measuring outcomes, etc.
Distant curriculum designers: as above, except that you don't need the interpersonal/teaching skills, and do need to be extra-exceptional in other respects. Apply here.
Programmers -- folks who can whip up simple prototype web apps quickly, to help with rationality training. Apply here.
Consultants -- folks who have relevant experience, and can spend a few hours offering suggestions on how to structure our workshops, or how to structure rationality groups more generally (after watching us teach, or by giving advice over the phone). If you've run successful workshops for adults of any sort (e.g., on Italian cooking), consider applying to help us organize our program. Apply here.
If you live in the SF bay area, you are also very welcome to come on a Saturday and help us test out draft lessons (by being a participant as we present them): email stephenpcole at gmail dot com to be added to that email list.
Do err on the side of applying; hope to hear from you soon!
(These application forms take the place of the previous ones; if you've already applied with a previous form, you're still golden, I'm just a bit behind on processing the applications.)
“I do not say this lightly... but if you're looking for superpowers, this is the place to start.”
--Michael Curzi, summer 2011 minicamp participant
Who: You and a class full of other aspiring rationalists and world-optimizers, from around the world.
What: Two 3-day weekend minicamps and one 8-day minicamp, filled with hands-on activities for applying rationality to your life, your goals, and the making of a better world. (See details in the FAQ.)
When and where: We're running three camps, so that we can do this for three sets of participants: May 11-13 and June 22-24 for the 3-day camps, and July 21-28 for the eight-day camp, all in the San Francisco Bay Area.
Why: Because you’re a social primate, and the best way to jump into a new way of thinking, make friends, and accomplish your goals is often to spend time with other primates who are doing just that.
- Hang out and explore the Bay Area with two dozen other people like you who are smart, interesting, and passionate about rationality
- Attend bonus sessions about style, body language, and confidence-building.
- Get help charting out career paths; and, entirely optionally for those interested, connect with folks at the Singularity Institute about optimal philanthropy.
Instructors: Eliezer Yudkowsky, Anna Salamon, Julia Galef, Andrew Critch, Luke Muehlhauser, and Michael Smith.
Cost: $650 for the three-day programs; $1500 for the week-long program. This includes lodging, meals, and tuition.
(Note that this *still* isn't quite enough to make running minicamps sustainable in the long run: lodging and meals at retreat centers start at around $90 per person per night, the "three-day camps" include four nights, and each workshop takes a staff of about 5 full-time people for over a month beforehand, most of us at $3k/month, counting curriculum development time, plus miscellaneous expenses. We are trying to strike a compromise between charging enough that we can run more camps and staying affordable, especially during our start-up phase; costs will probably go up in following years.)
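As a rough back-of-the-envelope check on those numbers (the participant count is an assumption, loosely based on the "two dozen other people" mentioned above; everything else is from the figures in this post):

```python
# Back-of-the-envelope minicamp cost per participant.
# All figures are from the post except `participants`, which is assumed.
nights = 4                 # the "three-day camps" include four nights
lodging_per_night = 90     # dollars per person per night at a retreat center
staff = 5                  # about five full-time people per camp
staff_monthly_pay = 3000   # $3k/month each, for roughly a month of prep

participants = 25          # assumption: "two dozen other people like you"

lodging_cost = nights * lodging_per_night            # 360 per participant
staff_cost = staff * staff_monthly_pay               # 15000 per camp
cost_per_participant = lodging_cost + staff_cost / participants

print(cost_per_participant)  # 960.0, versus the $650 three-day fee
```

Even before miscellaneous expenses, the estimate lands well above the $650 fee, which is the sustainability gap the parenthetical describes.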
Three days (or a week) isn’t long enough to learn rationality, but it's long enough to learn how to learn rationality, and to get some momentum toward doing so.
Come meet us, and see what you can do.
How do you notice when you're rationalizing? Like, what *actually* tips you off, in real life?
I've listed my cues below; please add your own (one idea per comment), and upvote the comments that you either: (a) use; or (b) will now try using.
I'll be using this list in a trial rationality seminar on Wednesday; it also sounds useful in general.
Partially in response to: The curse of identity
Joe studies long hours, and often prides himself on how driven he is to make something of himself. But in the actual moments of his studying, Joe often looks out the window, doodles, or drags his eyes over the text while his mind wanders. Someone sent him a link about which college majors lead to the greatest lifetime earnings, and he didn't get around to reading that either. Shall we say that Joe doesn't really care about making something of himself?
The Inuit may not have 47 words for snow, but Less Wrongers do have at least two words for belief. We find it necessary to distinguish between:
- Anticipations, what we actually expect to see happen;
- Professed beliefs, the set of things we tell ourselves we “believe”, based partly on deliberate/verbal thought.
This distinction helps explain how an atheistic rationalist can still get spooked in a haunted house; how someone can “believe” they’re good at chess while avoiding games that might threaten that belief; and why Eliezer had to actually crash a car before he viscerally understood what his physics books tried to tell him about stopping distance going up with the square of driving speed. (I helped Anna revise this - EY.)
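The stopping-distance fact is simple kinematics: braking converts kinetic energy (proportional to v²) into braking work over the stopping distance, so distance scales with the square of speed. A minimal sketch (the 7 m/s² deceleration is an assumed typical hard-braking figure, not from the post):

```python
def stopping_distance(speed, decel=7.0):
    """Braking distance in metres: d = v**2 / (2 * a).

    speed: metres per second; decel: assumed braking deceleration (m/s^2).
    """
    return speed ** 2 / (2 * decel)

# Doubling your speed quadruples the distance you need to stop --
# the relationship Eliezer's physics books stated but his anticipations missed.
ratio = stopping_distance(30) / stopping_distance(15)
print(ratio)  # 4.0
```

The deceleration constant cancels in the ratio, so the quadrupling holds regardless of the assumed braking force.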
A lot of our community technique goes into either (1) dealing with "beliefs" being an evolutionarily recent system, such that our "beliefs" often end up far screwier than our actual anticipations; or (2) trying to get our anticipations to align with more evidence-informed beliefs.
And analogously - this analogy is arguably obvious, but it's deep, useful, and easy to overlook in its implications - there seem to be two major kinds of wanting:
- Urges: concrete emotional pulls, produced in System 1's perceptual / autonomic processes
(my urge to drink the steaming hot cocoa in front of me; my urge to avoid embarrassment by having something to add to my accomplishments log)
- Goals: things we tell ourselves we’re aiming at, within deliberate/verbal thought and planning
(I have a goal to exercise three times a week; I have a goal to reduce existential risk)
Implication 1: You can import a lot of technique for "checking for screwy beliefs" into "checking for screwy goals".