
Extending the stated objectives

6 Stuart_Armstrong 13 January 2016 04:20PM

A putative new idea for AI control; index here.

A system that is optimizing a function of n variables, where the objective depends on a subset of size k<n, will often set the remaining unconstrained variables to extreme values; if one of those unconstrained variables is actually something we care about, the solution found may be highly undesirable.

Stuart Russell

Think of an AI directing a car, given the instructions to get someone to the airport as fast as possible (optimised variables include "negative of time taken to airport") with some key variables left out - such as a maximum speed, maximum acceleration, respect for traffic rules, and survival of the passengers and other humans.
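A toy numerical sketch of Russell's point (all numbers are hypothetical, invented for illustration): give an optimiser only the stated objective "minimise time to the airport", leave speed unconstrained, and the optimiser pushes speed to the extreme of its range, destroying an unstated objective it was never told about.

```python
# Hypothetical toy model: a crude grid-search optimiser that sees only
# the stated objective. Speed is the unconstrained variable; passenger
# safety is an unstated objective the optimiser never evaluates.
DISTANCE_KM = 30
speeds = range(10, 301, 10)  # candidate speeds in km/h

def stated_objective(speed):
    # SO: negative of time taken to the airport (to be maximised)
    return -DISTANCE_KM / speed

def passenger_safety(speed):
    # UO: a made-up safety score, fine at normal speeds, zero at extremes
    return max(0.0, 1.0 - speed / 200.0)

best = max(speeds, key=stated_objective)
print(best)                    # the extreme of the allowed range
print(passenger_safety(best))  # the UO has been driven to zero
```

In the training environments the post describes, speed never left the normal range, so the SO and the UO stayed correlated; the extreme only appears once the optimiser is free to leave that range.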

Call these other variables "unstated objectives" (UO), as contrasted with the "stated objectives" (SO) such as the time to the airport. In the normal environments in which we operate and design our AIs, the UOs are either correlated with the SOs (consider the SO "their heart is beating" and the UO "they're alive and healthy") or don't change much at all (the car-directing AI could have been trained on many examples of driving-to-the-airport, none of which included the driver killing their passengers).

Typically, SOs are easy to define, and the UOs are the more important objectives, left undefined either because they are complex, or because they didn't occur to us in this context (just as we don't often say "driver, get me to the airport as fast as possible, but alive and not permanently harmed, if you please. Also, please obey the following regulations and restrictions: 1.a.i.α: Non-destruction of the Earth....").

The control problem, in a nutshell, is that optimising SOs will typically set other variables to extreme values, including the UOs. The more extreme the optimisation, and the further from the typical environment, the more likely this is to happen.


Proper posture for mental arts

28 Valentine 31 August 2015 02:29AM

I'd like to start by way of analogy. I think it'll make the link to rationality easier to understand if I give context first.


I sometimes teach the martial art of aikido. The way I was originally taught, you had to learn how to "feel the flow of ki" (basically life energy) through you and from your opponent, and you had to make sure that your movements - both physical and mental - were such that your "ki" would blend with and guide the "ki" of your opponent. Even after I stopped believing in ki, though, there were some core elements of the art that I just couldn't do, let alone teach, without thinking and talking in terms of ki flow.

A great example of this is the "unbendable arm". This is a pretty critical thing to get right for most aikido techniques. And it feels really weird. Most people when they first get it think that the person trying to fold their arm isn't actually pushing because it doesn't feel like effort to keep their arm straight. Many students (including me once upon a time) end up taking this basic practice as compelling proof that ki is real. Even after I realized that ki wasn't real, I still had to teach unbendable arm this way because nothing else seemed to work.

…and then I found anatomical resources like Becoming a Supple Leopard.

It turns out that the unbendable arm works when:

That's it. If you do this correctly, you can relax most of your other arm muscles and still be able to resist pretty enormous force on your arm.

Why, you might ask? Well, from what I have gathered, this lets you engage your latissimus dorsi (pretty large back muscles) in stabilizing your elbow. There's also a bit of strategy where you don't actually have to fully oppose the arm-bender's strength; you just have to stabilize the elbow enough to be able to direct the push-down-on-elbow force into the push-up-on-wrist force.

But the point is, by understanding something about proper posture, you can cut literally months of training down to about ten minutes.


To oversimplify it a little bit, there are basically three things to get right about proper posture for martial arts (at least as I know them):

  1. You need to get your spine in the right position and brace it properly. (For the most part and for most people, this means tucking your pelvis, straightening your thoracic spine a bit, and tensing your abs a little.)
  2. You need to use your hip and shoulder ball-and-socket joints properly. (For the most part this seems to mean using them instead of your spine to move, and putting torque in them by e.g. screwing your elbow downward when reaching forward.)
  3. You need to keep your tissue supple & mobile. (E.g., tight hamstrings can pull your hips out of alignment and prevent you from using your hip joints instead of your mid-lumbar spine (i.e. waist) to bend over. Also, thoracic inflexibility usually locks people in thoracic kyphosis, making it extremely difficult to transfer force effectively between their lower body and their arms.)

My experience is that as people learn how to feel these three principles in their bodies, they're able to correct their physical postures whenever they need to, rather than having to wait for my seemingly magical touch to make an aikido technique suddenly really easy.

It's worth noting that this is mostly known, even in aikido dojos ("training halls"). They just phrase it differently and don't understand the mechanics of it. They'll say things like "Don't bend over; the other guy can pull you down if you do" and "Let the move be natural" and "Relax more; let ki flow through you freely."

But it turns out that getting the mechanical principles of posture down makes basically all the magic of aikido something even a beginner can learn how to see and correct.

A quick anecdote along these lines, which despite being illustrative, you should take as me being a bit of an idiot:

I once visited a dojo near the CFAR office. That night they were doing a practice basically consisting of holding your partner's elbow and pulling them to the ground. It works by a slight shift sideways to cause a curve in the lumbar spine, cutting power between their lower and upper bodies. Then you pull straight down and there's basically nothing they can do about it.

However, the lesson was in terms of feeling ki flow, and the instruction was to pull straight down. I was feeling trollish and a little annoyed about the wrongness and authoritarian delivery of the instruction, so I went to the instructor and asked: "Sensei, I see you pulling slightly sideways, and I had perhaps misheard the instructions to be that we should pull straight down. Should I be pulling slightly sideways too?"

At which point the sensei insisted that the verbal instructions were correct, concentrated on preventing the sideways shift in his movements, and obliterated his ability to demonstrate the technique for the rest of the night.


Brienne Yudkowsky has a lovely piece in which she refers to "mental postures". I highly recommend reading it. She does a better job of pointing at the thing than I think I would do here.

…but if you really don't want to read it just right now, here's the key element I'll be using: There seems to be a mental analog to physical posture.

We've had quite a bit of analogizing rationality as a martial art here. So, as a martial arts practitioner and instructor with a taste of the importance of deeply understanding body mechanics, I really want to ask: What, exactly, are the principles of good mental posture for the Art of Rationality?

In the way I'm thinking of it, this isn't likely to be things like "consider the opposite" or "hold off on proposing solutions". I refer to things of this breed as "mental movements" and think they're closer to the analogs of individual martial techniques than they are principles of mental orientation.

That said, we can look at mental movements to get a hint about what a good mental posture might do. In the body, good physical posture gives you both more power and more room for error: if you let your hands drift behind your head in a shihonage, having a flexible thoracic spine and torqued shoulders and braced abs can make it much harder for your opponent to throw you to the ground even though you've blundered. So, by way of analogy, what might an error in attempting to (say) consider the opposite look like, and what would a good "mental posture" be that would make the error matter less?

(I encourage you to think on your own about an answer for at least 60 seconds before corrupting your mind with my thoughts below. I really want a correct answer here, and I doubt I have one yet.)

When I think of how I've messed up in attempts to consider the opposite, I can remember several instances when my tone was dutiful. I felt like I was supposed to consider the opinion that I disagreed with or didn't want to have turn out to be true. And yet, it felt boring or like submitting or something like that to really take that perspective seriously. I felt like I was considering the opposite roughly the same way a young child replies to their parent saying "Now say that you're sorry" with an almost sarcastic "I'm sorry."

What kind of "mental posture" would have let me make this mistake and yet still complete the movement? Or better yet, what mental posture would have prevented the mistake entirely? At this point I intuit that I have an answer but it's a little tricky for me to articulate. I think there's a way I can hold my mind that makes the childish orientation to truth-seeking matter less. I don't do it automatically, much like most people don't automatically sit up straight, but I sort of know how to see my grasping at a conclusion as overreaching and then… pause and get my mental feet under my mental hips before I try again.

I imagine that wasn't helpful - but I think we have examples of good and bad mental posture in action. In attachment theory, I think that the secure attachment style is a description of someone who is using good mental posture even when in mentally/emotionally threatening situations, whereas the anxious and avoidant styles are descriptions of common ways people "tense up" when they lose good mental posture. I also think there's something interesting in how sometimes when I'm offended I get really upset or angry, and sometimes the same offense just feels like such a small thing - and sometimes I can make the latter happen intentionally.

The story I described above of the aikido sensei I trolled also highlights something that I think is important. In this case, although he didn't get very flustered, he couldn't change what he was doing. He seemed mentally inflexible, like the cognitive equivalent of someone who can't usefully block an overhead attack because of a stiff upper back restricting his shoulder movement. I feel like I've been in that state lots of times, so I feel like I can roughly imagine how my basic mental/emotional orientation to my situation and way of thinking would have to be in order to have been effective in his position right then - and why that can be tricky.

I don't feel like I've adequately answered the question of what good mental posture is yet. But I feel like I have some intuitions - sort of like being able to talk about proper posture in terms of "good ki flow". But I also notice that there seem to be direct analogs of the three core parts of good physical posture that I mentioned above:

  1. Have a well-braced "spine". Based on my current fledgling understanding, this seems to look something like taking a larger perspective, like imagining looking back at this moment 30 years hence and noticing what does and does not matter. (I think that's akin to tucking your hips, which is a movement in service of posture but isn't strictly part of the posture.) I imagine this is enormously easier when one has a well-internalized sense of something to protect.
  2. Move your mind in strong & stable ways, rather than losing "spine". I think this can look like "Don't act while triggered", but it's more a warning not to try to do heavy cognitive work while letting your mental "spine" "bend". Instead, move your mind in ways that you would upon reflection want your mind to move, and that you expect to be able to bear "weight".
  3. Make your mind flexible. Achieve & maintain full mental range of movement. Don't get "stiff", and view mental inflexibility as a risk to your mental health.

All three of these are a little hand-wavy. That third one in particular I haven't really talked about much - in part because I don't really know how to work on that well. I have some guesses, and I might write up some thoughts about that later. (A good solution in the body is called "mobilization", basically consisting of pushing on tender/stiff spots while you move the surrounding joints through their maximal range of motion.) Also, I don't know if there are more principles for the mind than these three, or if these three are drawing too strongly on the analogy and are actually a little distracting. I'm still at the stage where, for mental posture, I keep wanting to say the equivalent of "relax more and let ki flow."


A lot of people say I have excellent physical posture. I think I have a reasonably clear idea of how I made my posture a habit. I'd like to share that because I've been doing the equivalent in my mind for mental posture and am under the impression that it's getting promising results.

I think my physical practice comes down to three points:

  • Recognize that having good posture gives you superpowers. It's really hard to throw me down, and I can pretty effortlessly pull people to the ground. A lot of that is martial skill, but a huge chunk of it is just that good posture gives me excellent leverage. This transfers to being able to lift really heavy things and move across the room very efficiently and quickly when needed. This also gives me a pretty big leg up on learning physical skills. Recognizing that these were things I'd gain from learning good posture gave me a lot of drive to stick to my practice.
  • Focus on how the correct posture feels, and exactly how it's different from glitchy posture. I found it super-important to notice that my body feels different in specific ways when my shoulders are in the right position versus when they're too far forward or back. Verbal instructions like "Pull shoulders back" don't work nearly as well as the feeling in the body.
  • Choose one correction at a time, and always operate from that posture, pausing and correcting yourself when you're about to slip up. Getting good shoulder posture required that I keep my shoulders back all the time. When I would reach for water, I'd notice when my shoulder was in the too-far-forward position, and then pull back and fix my shoulder position before trying again. This sometimes required trying at very basic tasks several times, often quite slowly, until I could get it right each time.

Although I didn't add this until quite late, I would now add a fourth point when giving advice on getting good physical posture: make sure to mobilize the parts of your body that are either (a) preventing you from moving into a good position or (b) requiring you to be very stiff or tense to hold that position. The trouble is, I know how to do that for the body, but I'm not as sure about how to do that for the mind.

But the three bullet points above are instructions that I can follow with respect to mental posture, I think.

So, to the extent that that seems possible for you, I invite you to try to do the same - and let me know how it goes.

 

Applied Rationality Workshops: Jan 25-28 and March 1-4

20 AnnaSalamon 03 January 2013 01:00AM

The Center for Applied Rationality is running two more four-day workshops: Jan 25-28 and March 1-4 in the SF bay area.  Like the previous workshop, these sessions are targeted at ambitious, analytic people who have broad intellectual interests, and who care about making real-world projects work.  Less Wrong veterans and Less Wrong newcomers alike are welcome: as discussed below, we are intentionally bringing together folks with varied backgrounds and skill bases.

Workshop details:


Checklist of Rationality Habits

117 AnnaSalamon 07 November 2012 09:19PM
As you may know, the Center for Applied Rationality has run several workshops, each teaching content similar to that in the core sequences, but made more practical, and more into fine-grained habits.

Below is the checklist of rationality habits we have been using in the minicamps' opening session.  It was co-written by Eliezer, myself, and a number of others at CFAR.  As mentioned below, the goal is not to assess how "rational" you are, but, rather, to develop a personal shopping list of habits to consider developing.  We generated it by asking ourselves, not what rationality content it's useful to understand, but what rationality-related actions (or thinking habits) it's useful to actually do.

I hope you find it useful; I certainly have.  Comments and suggestions are most welcome; it remains a work in progress. (It's also available as a pdf.) 

To Learn Critical Thinking, Study Critical Thinking

26 gwern 07 July 2012 11:50PM

Critical thinking courses may increase students’ rationality, especially if they do argument mapping.

The following excerpts are from “Does philosophy improve critical thinking skills?”, Ortiz 2007.

1 Excerpts

This thesis makes a first attempt to subject the assumption that studying [Anglo-American analytic] philosophy improves critical thinking skills to rigorous investigation.

…Thus the second task, in Chapter 3, is to articulate and critically examine the standard arguments that are raised in support of the assumption (or rather, would be raised if philosophers were in the habit of providing support for the assumption). These arguments are found to be too weak to establish the truth of the assumption. The failure of the standard arguments leaves open the question of whether the assumption is in fact true. The thesis argues at this point that, since the assumption is making an empirical assertion, it should be investigated using standard empirical techniques as developed in the social sciences. In Chapter 4, I conduct an informal review of the empirical literature. The review finds that evidence from the existing empirical literature is inconclusive. Chapter 5 presents the empirical core of the thesis. I use the technique of meta-analysis to integrate data from a large number of empirical studies. This meta-analysis gives us the best yet fix on the extent to which critical thinking skills improve over a semester of studying philosophy, general university study, and studying critical thinking. The meta-analysis results indicate that students do improve while studying philosophy, and apparently more so than general university students, though we cannot be very confident that this difference is not just the result of random variation. More importantly, studying philosophy is less effective than studying critical thinking, regardless of whether one is being taught in a philosophy department or in some other department. Finally, studying philosophy is much less effective than studying critical thinking using techniques known to be particularly effective such as LAMP.


Minicamps on Rationality and Awesomeness: May 11-13, June 22-24, and July 21-28

24 AnnaSalamon 29 March 2012 08:48PM

"I do not say this lightly... but if you're looking for superpowers, this is the place to start."

--Michael Curzi, summer 2011 minicamp participant

Who: You and a class full of other aspiring rationalists and world-optimizers, from around the world.

What: Two 3-day weekend minicamps and one 8-day minicamp, filled with hands-on activities for applying rationality to your life, your goals, and the making of a better world.  (See details in the FAQ.)

When and where: We're running three camps, so that we can do this for three sets of participants: May 11-13 and June 22-24 for the 3-day camps, and July 21-28 for the eight-day camp, all in the San Francisco Bay Area.

Why: Because you’re a social primate, and the best way to jump into a new way of thinking, make friends, and accomplish your goals is often to spend time with other primates who are doing just that. 

Other reasons:

  • Hang out and explore the Bay Area with two dozen other people like you who are smart, interesting, and passionate about rationality
  • Attend bonus sessions about style, body language, and confidence-building.
  • Get help charting out career paths; and, entirely optionally for those interested, connect with folks at the Singularity Institute about optimal philanthropy.

Instructors:

  • Eliezer Yudkowsky
  • Anna Salamon
  • Julia Galef
  • Andrew Critch
  • Luke Muehlhauser
  • Michael Smith

Cost:  $650 for the three-day programs; $1500 for the week-long program.  This includes lodging[1], meals, and tuition.  

(Note that this *still* isn't quite enough to make running minicamps sustainable in the long run; lodging + meals at retreat centers start at around $90 per person per night, the "three-day camps" include four nights, and these workshops take a staff of about 5 full-time people for over a month prior to each workshop, most of us at $3k/month, counting curriculum development time (plus miscellaneous expenses).  We are trying to strike a compromise between "charge enough that we can run more camps" and staying affordable, especially for our start-up phase; costs will probably go up in following years.)

Three days (or a week) isn’t long enough to learn rationality, but it's long enough to learn how to learn rationality, and to get some momentum toward doing so.

Come meet us, and see what you can do.

Apply now.


Biased Pandemic

56 freyley 13 March 2012 11:32PM

Recently, Portland Lesswrong played a game that was a perfect trifecta of: difficult mental exercise; fun; and an opportunity to learn about biases and recognize them in yourself and others. We're still perfecting it, and we'd welcome feedback, especially from people who try it.

The Short Version

The game is a combination of Pandemic, a cooperative board game that is cognitively demanding, and the idea of roleplaying cognitive biases. In our favorite way of playing it (so far), everyone selects a bias at random, and then attempts to exaggerate that bias in their arguments and decisions during the game. Everyone attempts to identify the biases in the other players, and, when a bias is guessed, the guessed player selects a new bias and begins again.


For-Profit Rationality Training

24 ksvanhorn 28 December 2011 09:42PM

As I've been reading through various articles and their comments on Less Wrong, I've noticed a theme that has appeared repeatedly: a frustration that we are not seeing more practical benefits from studying rationality. For example, Eliezer writes in A Sense that More Is Possible,

Why aren't "rationalists" surrounded by a visible aura of formidability? Why aren't they found at the top level of every elite selected on any basis that has anything to do with thought? Why do most "rationalists" just seem like ordinary people...

Yvain writes in Extreme Rationality: It's Not That Great,

...I've gotten countless clarity-of-mind benefits from Overcoming Bias' x-rationality, but practical benefits? Aside from some peripheral disciplines, I can't think of any.

patrissimo wrote in a comment on another article,

Sorry, folks, but compared to the self-help/self-development community, Less Wrong is currently UTTERLY LOSING at self-improvement and life optimization.

These writers have also offered some suggestions for improving the situation. Eliezer writes,

Of this [question] there are several answers; but one of them, surely, is that they have received less systematic training of rationality in a less systematic context than a first-dan black belt gets in hitting people.

patrissimo describes what he thinks an effective rationality practice would look like.

  1. It is a group of people who gather in person to train specific skills.
  2. While there are some theoreticians of the art, most people participate by learning it and doing it, not theorizing about it.
  3. Thus the main focus is on local practice groups, along with the global coordination to maximize their effectiveness (marketing, branding, integration of knowledge, common infrastructure). As a result, it is driven by the needs of the learners [emphasis added].
  4. You have to sweat, but the result is you get stronger.
  5. You improve by learning from those better than you, competing with those at your level, and teaching those below you.
  6. It is run by a professional, or at least someone getting paid [emphasis added] for their hobby. The practicants receive personal benefit from their practice, in particular from the value-added of the coach, enough to pay for talented coaches.

Dan Nuffer and I have decided that it's time to stop talking and start doing. We are in the very early stages of creating a business to help people improve their lives by training them in instrumental rationality. We've done some preliminary market research to get an idea of where the opportunities might lie. In fact, this venture got started when, on a whim, I ran a poll on ask500people.com asking,

Would you pay $75 for an interactive online course teaching effective decision-making skills?

I got 299 responses in total. These are the numbers that responded with "likely" or "very likely":

  • 23.4% (62) overall.
  • 49% (49 of 100) of the respondents from India.
  • 10.6% (21 of 199) of the respondents not from India.
  • 9.0% (8 of 89) of the respondents from the U.S.

These numbers were much higher than I expected, especially the numbers from India, which still puzzle me. Googling around a bit, though, I found an instructor-led online decision-making course for $130, and a one-day decision-making workshop offered in the UK for £200 (over $350)... and the Google keyword tool returns a large number of search terms (800) related to "decision-making", many of them with a high number of monthly searches.

So it appears that there may be a market for training in effective decision-making -- something that could be the first step towards a more comprehensive training program in instrumental rationality. Some obvious market segments to consider are business decision makers, small business owners, and intelligent people of an analytical bent (e.g., the kind of people who find Less Wrong interesting). An important subset of this last group is INTJ personality types; I don't know if there is an effective way to find and market to specific Myers-Briggs personality types, but I'm looking into it.

"Life coaching" is a proven business, and its growing popularity suggests the potential for a "decision coaching" service; in fact, helping people with big decisions is one of the things a life coach does. One life coach of 12 years described a typical client as age 35 to 55, who is "at a crossroads, must make a decision and is sick of choosing out of safety and fear." Life coaches working with individuals typically charge around $100 to $300 per hour. As far as I can tell, training in decision analysis / instrumental rationality is not commonly found among life coaches. Surely we can do better.

Can we do effective training online? patrissimo thinks that gathering in person is necessary, but I'm not so sure. His evidence is that "all the people who have replied to me so far saying they get useful rationality practice out of the LW community said the growth came through attending local meetups." To me this is weak evidence -- it seems to say more about the effectiveness of local meetups vs. just reading about rationality. In any event, it's worth testing whether online training can work, since

  • not everyone can go to meetups,
  • it should be easier to scale up, and
  • not to put too fine a point on it, but online training is probably more profitable.

To conclude, one of the things an entrepreneur needs to do is "get out of the building" and talk to members of the target market. We're interested in hearing what you think. What ideas do you think would be most effective in training for instrumental rationality, and why? What would you personally want from a rationality training program? What kinds of products / services related to rationality training would you be interested in buying?

Teachable Rationality Skills

52 Eliezer_Yudkowsky 27 May 2011 09:57PM

Recent brainstorming sessions at SIAI (with participants including Anna, Carl, Jasen, Divia, Will, Amy Willey, and Andrew Critch) have started to produce lists of rationality skills that we could potentially try to teach (at Rationality Boot Camp, at Less Wrong meetups, or similar venues).  We've also been trying to break those skills down to the 5-second level (step 2) and come up with ideas for exercises that might teach them (step 3) although we haven't actually composed those exercises yet (step 4, where the actual work takes place).

The bulk of this post will mainly go into the comments, which I'll try to keep to the following format:  A top-level comment is a major or minor skill to teach; upvote this comment if you think this skill should get priority in teaching.  Sub-level comments describe 5-second subskills that go into this skill, and then third-level comments are ideas for exercises which could potentially train that 5-second skill.  If anyone actually went to the work of composing a specific exercise people could run through, that would go to the fourth-level of commenting, I guess.  For some major practicable arts with a known standard learning format like "Improv" or "Acting", I'll put the exercise at the top and guesses at which skills it might teach below.  (And any plain old replies can go at any level.)

I probably won't be able to get to all of what we brainstormed today, so here's a PNG of the Freemind map that I generated during our session.

Ace Attorney: pioneer Rationalism-didactic game?

19 Raw_Power 23 May 2011 11:28PM

This article aims to show that Ace Attorney is possibly the first rationalist game in the lesswrongian sense, or at least a remarkable proto-example; that it subliminally works to raise the sanity waterline in the general population; and that it might provide a template on which to base future works that aim to achieve a similar effect.

The Ace Attorney series of games for the Nintendo DS console puts you in the shoes of Phoenix Wright, an attorney who, in the vein of Perry Mason, takes on difficult cases to defend his clients from a judicial system heavily inspired by that of Japan, in which the odds are so stacked against the defense that it's practically a kangaroo court where your clients are guilty until proven innocent.

For those unfamiliar with the game, and those who want to explore the "social criticism" aspect of the game, I wholeheartedly recommend this most excellent article from The Escapist. Now that that's out of the way, we can move on to what makes this relevant for Less Wrong. What makes this game uniquely interesting from a Rationalist POV is that the entire game mechanics are based on

  • gathering material evidence
  • finding the factual contradictions in the witnesses' testimonies
  • using the evidence to bust the lies open and force the truth out
