A Call for Constant Vigilance

26 katydee 03 April 2013 09:52AM

Related to: What Do We Mean By "Rationality?"

Rationality has many facets, some relatively simple and others quite complex. As a result, it can often be hard to determine which aspects of rationality you should or shouldn't stress.

An extremely basic and abstract model of how rationality works might look a little something like this:

  1. Collect evidence about your environment from various sources
  2. Update your model of reality based on evidence collected (optimizing the updating process is more or less what we know as epistemic rationality)
  3. Act in accordance with what your model of reality indicates is best for achieving your goals (optimizing the actions you take is more or less what we know as instrumental rationality)
  4. Repeat continually forever
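
In code, this loop might look something like the following toy sketch (every function here is a deliberately trivial placeholder for illustration, not a real implementation):

```python
# Toy sketch of the loop above; each step is a trivial placeholder.

def collect_evidence():
    """Step 1 -- stand-in for actually observing your environment."""
    return input("What did you observe? ")

def update(model, evidence):
    """Step 2 -- epistemic rationality; a real update would weigh evidence properly."""
    return model + [evidence]

def choose_action(model, goals):
    """Step 3 -- instrumental rationality; pick what the model says serves your goals."""
    return f"pursue {goals[0]}, informed by {len(model)} observations"

def rationality_loop(goals):
    model = []
    while True:  # Step 4 -- repeat continually forever
        evidence = collect_evidence()
        model = update(model, evidence)
        print(choose_action(model, goals))
```
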
A lot of thought, both on LessWrong and within the academic literature on heuristics and biases, has gone into improving epistemic rationality, and while improving instrumental rationality was less of a focus at first, recently the community has been focusing more on it. On the other hand, improving your ability to collect evidence has been relatively neglected-- hence the (in-progress as of this writing) Situational Awareness sequence.

But most neglected of all has been the last step, "repeat continually forever." This sounds like a trivial instruction but is in fact highly important to emphasize. All your skills and training and techniques mean nothing if you don't use them, and unfortunately there are many reasons that you might not use your skills.

You might be offended, angry, hurt, or otherwise emotionally compromised. Similarly, you might be sleepy, inebriated, hungry, or otherwise physically compromised. You might be overconfident in your ability to handle a certain type of problem or situation, and hence not bother to think of other ways that might work better.[1] You might simply not bother to apply your skills because you don't think they're necessary, missing out on potential gains that you don't see at a glance-- or maybe even don't know exist. All in all, there are many times in which you may be missing out on the benefits that your skills can provide.

It may therefore be worthwhile to occasionally check whether or not you are actually applying your skills. Further, try to make this sort of check a habit, especially when encountering circumstances where people would typically be less than rational. If you find that you aren't using your skills as often as you'd expect, that may be cause for alarm, and at the very least is cause for introspection. After all, if rationality skills can constantly be applied to succeeding in everyday life, we should be constantly on the watch for opportunities to apply them, as well as for potential lapses in our vigilance.

I indeed suspect that most LessWrong users would benefit more from being more vigilant in practicing and applying basic rationality skills than they would from learning cool advanced techniques. This principle is generally true in the martial arts, and both the inside and outside view strongly suggest to me that it is true for the art of rationality as well.

All in all, improving your rationality is a matter of serious practice and changing your mindset, not just learning cool new life hacks-- so next time you think about improving your rationality, don't look for new tricks, but new ways to truly integrate the principles you are already familiar with.


[1] The footnote to Humans Are Not Automatically Strategic describes several examples where this might apply.

How to Not Get Offended

11 katydee 23 March 2013 11:12PM

Followup to: Don't Get Offended

Draws heavily on: Stoicism, Keep Your Identity Small, Living Luminously

Previously, we discussed why not getting offended might be an effective strategy to adopt in order to increase one's practical epistemic rationality. That's all well and good, but just as knowing about biases isn't the same as protecting ourselves from them, the simple desire to avoid being offended is (usually) insufficient to actually avoid it-- practice, too, is required.

So what should you actually practice if you find yourself becoming offended and want to stop? This post aims to address that. In doing so, it also features an expanded discussion of one question that seemed to be a sticking point for several posters in the previous discussion-- if you aren't getting offended, how will you discourage offensive and inappropriate behaviors?

Preparation

First, you need to really, truly recognize that experiencing the feeling of being offended is an undesirable process. You must see why experiencing offense runs counter to knowing the truth.

A good litmus test is to check whether experiencing the feeling of being offended seems obviously bad to you-- not the existence of the feeling itself or any behaviors tied to it, but the fact that you are experiencing it. It is important to understand that this refers only to the mental experience of being offended-- this post focuses entirely on the A (Affect) component of Alicorn's ABC model.

While it might sound silly to have the preliminary step be simply thinking that being offended is bad, if you don't think that, there's not much point in practicing the remaining steps. In fact, if you don't think that, practicing the remaining steps may be harmful.

Part One: Detection

In order to stop being offended-- or really alter nearly anything about your mental state-- the first step is to increase your awareness of when you are becoming offended and what that process looks like in as early a stage as possible. As in the case of ugh fields, being mindful of your reactions and "watching for the flinch" is an important early step.

As soon as you feel yourself becoming offended, you should notice this. It is then critical to truly inspect your reactions and determine why you are becoming offended. This doesn't mean thinking things like "I was offended because she insulted my friend," which has insufficient detail. Try for something more like "I was offended because she made a severe criticism of another person in the group and I feel that she did not have the relevant social capital to justify making her statement." If you don't have a detailed conception of exactly what it is that is offending you, moving forward will be difficult.

At times you will not be able to do this thanks to the heat of the moment. That's okay and in point of fact it is expected-- truly understanding one's own motivations and responses can be difficult even in unemotional situations. If necessary, wait for calmer times to evaluate such issues, or ask others for clarification or predictions. While the input of others might not always be useful, close friends (or unusually perceptive acquaintances) can in many cases pinpoint causes of your behavior that you might be blind to.

If emotionally possible, testing these models is certainly helpful, though I recognize that this can be challenging at times and do not recommend it to the unprepared. In particular, having your friends try to offend you to test your reactions is often a poor idea, as the emotional responses involved can be unpleasant for multiple parties.

Part Two: Dissolution

Once you have the ability to detect when and why you are becoming offended, there are multiple steps you can take. The two techniques that have been most successful for me in the moment are what I like to call Dissolution and Defense.[1]

The first of those two methods, Dissolution, is what I tend to use under normal circumstances. This method attempts to dissolve feelings of offense by simply understanding them really well and then applying the Principle of Gendlin. For instance, if someone has made an insulting remark to me, I might think to myself "If this criticism is false, then it can easily be defeated by the truth. If this criticism is true, well, you know what P.C. Hodgell says about that... perhaps this criticism was not delivered in the most tactful manner, but I have no need to be offended, for the criticism will succeed or fail on the basis of the truth, not on the basis of whether it is appropriate."

For me, Gendlin is a true friend and can resolve most of these issues fairly trivially. However, this does not work the same for all individuals. Other techniques, such as perspective shifting, may be more reliable for others. The common thread I have found among many people who can avoid being offended is the concept that being offended is a matter of one's own reaction, not the external world. I irreverently refer to this as the Principle of Hamlet-- there is nothing either good or bad, but thinking makes it so. It is a key tenet of Stoic thought.

Note that there are a few other things to consider. For instance, one should beware sticky brains when executing this technique. Personally, my brain isn't very sticky, but if yours is you may have to plan around that. There are many considerations similar to this one regarding personal mental styles, and the topic of "Things You Should Know About Your Brain" probably merits a post of its own, but I don't really have space to go into it here.[2] Suffice it to say that nothing presented here is set in stone and you should make whatever modifications are appropriate in order to fit this into your own mental style.

With practice, I have found that the sentiment expressed in this comment can apply to reactions as well as to personal traits-- reactions that I don't like having tend to go away soon after I understand them, since I can then apply these methods to their dissolution.

Part Three: Defense

However, I've found that there are some times in which I am unable to successfully dissolve my feelings of offense. It may be that I am extremely hungry or tired or otherwise impaired and thus have less than normal ability to control my reactions, or that I am simply too shocked to react normally. In this situation, I resort to the secondary method, Defense. This is not glamorous and not cool but it does work. The key to defense is isolating yourself from stimuli that produce undesirable results.

There are multiple forms of this-- the most basic one is simply leaving the area. Other simple methods include drowning something out (simple technique: the classic "I'M NOT LISTENING LA LA LA LA LA," except inside your head), suddenly becoming very (authentically) interested in something else, pretending you have to take a call, etc. One extremely important note is that these methods should be a last resort. Otherwise, they have the potential to become an excuse. I seriously considered not putting them in the post at all because of the risk that they would make people not take the first method seriously enough. Ultimately, I decided that it would be better for most people to know than to not know-- but seriously, be careful with this.

If you do find yourself having to resort to these methods more often than you would like, there is another option-- active defense. I generally prefer action to reaction, so I tend to prefer active defenses to reactive ones. Active defenses involve self-modification so that certain stimuli no longer produce undesirable results or produce less undesirable results.

For instance, if I know that I am going to encounter someone who may make offensive remarks regarding another of my friends, I may steel myself for this prior to the encounter, saying to myself "While it may be that some of my friends dislike each other and want to express this to me, I should not fall into the trap of becoming offended and getting into an argument over whether or not one's criticism of the other is valid. Not all my friends have to be friends with one another, and trying to enforce this will only add to the trouble. Instead I will make a mild remark and move on."

This is an especially effective method when it comes to preventing surprised or shocked offended reactions, though of course one must always beware unknown unknowns. Marcus Aurelius engaged in an extremely general form of this, advocating that one begin each day by preparing oneself to meet with all sorts of offenses while avoiding anger or irritation. In some respects, the overall practice of Stoicism could be considered an advanced form of active defense-- though not just against becoming offended, but against all wild or uncontrolled reactions.

Part Four: Discouragement

This step is where you evaluate whether taking action to prevent further offensive behavior is merited or useful. As stated earlier, I believe that even in cases when it is instrumentally useful to show offense, one can still perform actions indicating offense without actually experiencing the internal state of being offended. The question then becomes when it is appropriate to do so.

In previous discussion, Oligopsony pointed out that taking offense at inappropriate behaviors can be considered a public good. I disagree to an extent, because I think that in many situations claiming to be offended or acting offended can in fact escalate a situation that would otherwise pass with a small amount of awkwardness and concern. However, simply allowing (truly) inappropriate behavior to continue without objection tacitly indicates that that behavior is acceptable and thus carries negative consequences of its own.

Generally speaking, I find that it is often wise to complain about offensive behaviors if you think it is likely that those behaviors will offend others. You should be wary about generalizing from one example, though. I find the sound of silverware contacting teeth to be both off-putting and offensive, but this is not something that I bother to point out to people that I don't expect to interact with often, since I am moderately confident that it is a pet peeve that most people don't care about and aren't offended by. On the other hand, I do point out that fact to people that I expect to interact with frequently if I notice them doing it, since in this case it is worth my time to potentially avert future instances.

A friend of mine who is currently commissioned as a military officer says that one key principle of effective leadership is "praise in public, punish in private"-- in other words, save criticisms for private encounters so that you don't have to worry about potential status implications of making the criticism around others. In my experience, this is also an effective way to deal with offensive behavior while minimizing social awkwardness and the potential for escalation.

In some situations, though, it is simply necessary to stand up when no one else is willing and confront offensive behavior directly. I have done so several times and will say that while it is usually uncomfortable for all those concerned, the result can be worth it. That being said, I urge you to use extreme caution when evaluating whether or not it is necessary to do so. My impression is that many situations that people deem worthy of confrontation could be resolved more effectively through less direct means.

Part Five: CONSTANT VIGILANCE

As a final thought, I've seen a lot of people, thinking they've eradicated some bad habit, fall back into it once they consider themselves "safe." When installing epistemic habits, this risk is especially worrisome, since you may not notice that you have lapsed-- in which case you can become the highly annoying sort of person who is weak in precisely the domains they consider themselves strong in, and thus resistant to correction.

I must emphasize that I would much rather deal with someone who is offended and knows it than someone who is offended and thinks that he cannot possibly be so. So if you do wish to become a person who does not get offended, do it right. After all, it is dangerous to be half a rationalist.

[1] This isn't to say that those techniques will necessarily be the most useful for you-- merely that I have found them successful and consider myself sufficiently qualified to explain them. It might be that alternative strategies could be more useful for you-- if so, feel free to post them in the comments, as they could potentially make this post that much more useful for future readers.

[2] If anyone wants to take the helm and write this post, they have my blessing-- my queue is overflowing right now. Please do send me a link if you do end up doing this, though.

Don't Get Offended

32 katydee 07 March 2013 02:11AM

Related to: Politics is the Mind-Killer, Keep Your Identity Small

Followed By: How to Not Get Offended

One oft-underestimated threat to epistemic rationality is getting offended. While getting offended by something sometimes feels good and can help you assert moral superiority, in most cases it doesn't help you figure out what the world looks like. In fact, getting offended usually makes it harder to figure out what the world looks like, since it means you won't be evaluating evidence very well. In Politics is the Mind-Killer, Eliezer writes that "people who would be level-headed about evenhandedly weighing all sides of an issue in their professional life as scientists, can suddenly turn into slogan-chanting zombies when there's a Blue or Green position on an issue." Don't let yourself become one of those zombies-- all of your skills, training, and useful habits can be shut down when your brain kicks into offended mode!

One might point out that getting offended is a two-way street and that it might be more appropriate to make a post called "Don't Be Offensive." That feels like a just thing to say-- as if you are targeting the aggressor rather than the victim. And on a certain level, it's true-- you shouldn't try to offend people, and if you do in the course of a normal conversation it's probably your fault. But you can't always rely on others around you being able to avoid doing this. After all, what's offensive to one person may not be so to another, and they may end up offending you by mistake. And even in those unpleasant cases when you are interacting with people who are deliberately trying to offend you, isn't staying calm desirable anyway?

The other problem I have with the concept of being offended as victimization is that, when you find yourself getting offended, you may be a victim, but you're being victimized by yourself. Again, that's not to say that offending people on purpose is acceptable-- it obviously isn't. But you're the one who gets to decide whether or not to be offended by something. If you find yourself getting offended by things as an automatic reaction, you should seriously evaluate why that is your response.

There is nothing inherent in a set of words that makes them offensive or inoffensive-- your reaction is an internal, personal process. I've seen some people stay calm while others literally screamed racial slurs in their faces, and I've seen other people get offended by the slightest implication or slip of the tongue. What type of reaction you have is largely up to you, and if you don't like your current reactions you can train better ones-- this is a core principle of the extremely useful philosophy known as Stoicism.

Of course, one (perhaps Robin Hanson) might also point out that getting offended can be socially useful. While true-- quickly responding in an offended fashion can be a strong signal of your commitment to group identity and values[1]-- that doesn't really relate to what this post is talking about. This post is talking about the best way to acquire correct beliefs, not the best way to manipulate people. And while getting offended can be a very effective way to manipulate people-- and hence a tactic that is unfortunately often reinforced-- it is usually actively detrimental for acquiring correct beliefs. Besides, the signalling value of offense should be no excuse for not knowing how not to be offended. After all, if you find it socially necessary to pretend that you are offended, doing so is not exactly difficult.

Personally, I have found that the cognitive effort required to build a habit of not getting offended pays immense dividends. Getting offended tends to shut down other mental processes and constrain you in ways that are often undesirable. In many situations, misunderstandings and arguments can be diminished or avoided completely if one is unwilling to become offended and practiced in the art of avoiding offense. Further, some of those situations are ones in which thinking clearly is very important indeed! All in all, while getting offended does often feel good (in a certain crude way), it is a reaction that I have no regrets about relinquishing.

 

[1] In Keep Your Identity Small, Paul Graham rightly points out that one way to prevent yourself from getting offended is to let as few things into your identity as possible.

Memetic Tribalism

43 [deleted] 14 February 2013 03:03AM

Related: politics is the mind killer, other optimizing

When someone says something stupid, I get an urge to correct them. Based on the stories I hear from others, I'm not the only one.

For example, some of my friends are into this rationality thing, and they've learned about all these biases and correct ways to get things done. Naturally, they get irritated with people who haven't learned this stuff. They complain about how their family members or coworkers aren't rational, and they ask what is the best way to correct them.

I could get into the details of the optimal set of arguments to turn someone into a rationalist, or I could go a bit meta and ask: "Why would you want to do that?"

Why should you spend your time correcting someone else's reasoning?

One reason that comes up is that it's valuable for some reason to change their reasoning. OK, when is it possible?

  1. You actually know better than them.

  2. You know how to patch their reasoning.

  3. They will be receptive to said patching.

  4. They will actually change their behavior if they accept the patch.

It seems like it should be rather rare for those conditions to all be true, or even to be likely enough for the expected gain to be worth the cost, and yet I feel the urge quite often. And I'm not thinking it through and deciding, I'm just feeling an urge; humans are adaptation executors, and this one seems like an adaptation. For some reason "correcting" people's reasoning was important enough in the ancestral environment to be special-cased in motivation hardware.
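
To see why, multiply the conditions out. A back-of-the-envelope sketch, with every probability and utility figure invented purely for illustration:

```python
# All numbers are made up for illustration; only the structure matters.
p_know_better = 0.5   # 1. you actually know better than them
p_can_patch   = 0.5   # 2. you know how to patch their reasoning
p_receptive   = 0.3   # 3. they will be receptive to said patching
p_will_change = 0.3   # 4. they will actually change their behavior

p_success = p_know_better * p_can_patch * p_receptive * p_will_change
print(f"P(correction works) = {p_success:.3f}")  # ~0.023, roughly 1 in 44

value_if_it_works = 10.0  # arbitrary utility units
cost_of_trying    = 1.0   # time, mental energy, social friction
expected_gain = p_success * value_if_it_works - cost_of_trying
print(f"Expected gain = {expected_gain:.2f}")  # negative -- not worth attempting on these numbers
```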

I could try to spin an ev-psych just-so story about tribal status, intellectual dominance hierarchies, ingroup-outgroup signaling, and whatnot, but I'm not an evolutionary psychologist, so I wouldn't actually know what I was doing, and the details don't matter anyway. What matters is that this urge seems to be hardware, and it probably has nothing to do with actual truth or your strategic concerns.

It seems to happen to everyone who has ideas. Social justice types get frustrated with people who seem unable to acknowledge their own privilege. The epistemological flamewar between atheists and theists rages continually across the internet. Tech-savvy folk get frustrated with others' total inability to explore and use Google. Some aspiring rationalists get annoyed with people who refuse to decompartmentalize or claim that something is in a separate magisterium.

Some of those border on being just classic blue vs green thinking, but from the outside, the rationality example isn't all that different. They all seem to be motivated mostly by "This person fails to display the complex habits of thought that I think are fashionable; I should {make fun | correct them | call them out}."

I'm now quite skeptical that my urge to correct reflects an actual opportunity to win by improving someone's thinking, given that I'd feel it whether or not I could actually help, and that it seems to be caused by something else.

The value of attempting a rationality-intervention has gone back down towards baseline, but it's not obvious that the baseline value of rationality interventions is all that low. Maybe it's a good idea, even if there is a possible bias supporting it. We can't win just by reversing our biases; reversed stupidity is not intelligence.

The best reason I can think of to correct flawed thinking is if your ability to accomplish your goals directly depends on their rationality. Maybe they are your business partner, or your spouse. Someone specific and close who you can cooperate with a lot. If this is the case, it's near the same level of urgency as correcting your own.

Another good reason (to discuss the subject at least) is that discussing your ideas with smart people is a good way to make your ideas better. I often get my dad to poke holes in my current craziness, because he is smarter and wiser than me. If this is your angle, keep in mind that if you expect someone else to correct you, it's probably not best to go in making bold claims and implicitly claiming intellectual dominance.

An OK reason is that creating more rationalists is valuable in general. This one is less good than it first appears. Do you really think your comparative advantage right now is in converting this person to your way of thinking? Is that really worth the risk of social friction and expenditure of time and mental energy? Is this the best method you can think of for creating more rationalists?

I think it is valuable to raise the sanity waterline when you can, but using methods of mass instruction like writing blog posts, administering a meetup, or launching a whole rationality movement is a lot more effective than arguing with your mom. Those options aren't for everybody of course, but if you're into waterline-manipulation, you should at least be considering strategies like them. At least consider picking a better time.

Another reason that gets brought up is that turning people around you into rationalists is instrumental in a selfish way, because it makes life easier for you. This one is suspect to me, even without the incentive to rationalize. Did you also seriously consider sabotaging people's rationality to take advantage of them? Surely that's nearly as plausible a priori. For what specific reason did your search process rank cooperation over predation?

I'm sure there are plenty of good reasons to prefer cooperation, but of course no search process was ever run. All of these reasons that come to mind when I think of why I might want to fix someone's reasoning are just post-hoc rationalizations of an automatic behavior. The true chain of cause-and-effect is observe->feel->act; no planning or thinking involved, except where it is necessary for the act. And that feeling isn't specific to rationality, it affects all mental habits, even stupid ones.

Rationality isn't just a new memetic orthodoxy for the cool kids, it's about actually winning. Every improvement requires a change. Rationalizing strategic reasons for instinctual behavior isn't change, it's spending your resources answering questions with zero value of information. Rationality isn't about what other people are doing wrong; it's about what you are doing wrong.

I used to call this practice of modeling other people's thoughts to enforce orthodoxy on them "incorrect use of empathy", but in terms of ev-psych, it may be exactly the correct use of empathy. We can call it Memetic Tribalism instead.

(I've ignored the other reason to correct people's reasoning, which is that it's fun and status-increasing. When I reflect on my reasons for writing posts like this, it turns out I do it largely for the fun and internet status points, but I try to at least be aware of that.)

Right for the Wrong Reasons

14 katydee 24 January 2013 12:02AM

One of the few things that I really appreciate having encountered during my study of philosophy is the Gettier problem. Paper after paper has been published on this subject, starting with Gettier's original "Is Justified True Belief Knowledge?" In brief, Gettier argues that knowledge cannot be defined as "justified true belief" because there are cases in which people hold a justified true belief, but the belief is true for reasons unconnected to its justification.

For instance, Gettier cites the example of two men, Smith and Jones, who are applying for a job. Smith believes that Jones will get the job, because the president of the company told him that Jones would be hired. He also believes that Jones has ten coins in his pocket, because he counted the coins in Jones's pocket ten minutes ago (Gettier does not explain this behavior). Thus, he forms the belief "the person who will get the job has ten coins in his pocket."

Unbeknownst to Smith, though, he himself will get the job, and further he himself has ten coins in his pocket that he was not aware of-- perhaps he put someone else's jacket on by mistake. As a result, Smith's belief that "the person who will get the job has ten coins in his pocket" was correct, but only by luck.

While I don't find the primary purpose of Gettier's argument particularly interesting or meaningful (much less the debate it spawned), I do think Gettier's paper does a very good job of illustrating the situation that I refer to as "being right for the wrong reasons." This situation has important implications for prediction-making and hence for the art of rationality as a whole.

Simply put, a prediction that is right for the wrong reasons isn't actually right from an epistemic perspective.

If I predict, for instance, that I will win a 15-touch fencing bout, implicitly believing this will occur when I strike my opponent 15 times before he strikes me 15 times, and I in fact lose fourteen touches in a row, only to win by forfeit when my opponent intentionally strikes me many times in the final touch and is disqualified for brutality, my prediction cannot be said to have been accurate.

Where this gets more complicated is with predictions that are right for the wrong reasons, but the right reasons still apply. Imagine the previous example of a fencing bout, except this time I score 14 touches in a row and then win by forfeit when my opponent flings his mask across the hall in frustration and is disqualified for an offense against sportsmanship. Technically, my prediction is again right for the wrong reasons-- my victory was not thanks to scoring 15 touches, but thanks to my opponent's poor sportsmanship and subsequent disqualification. However, I likely would have scored 15 touches given the opportunity.

In cases like this, it may seem appealing to credit my prediction as successful, as it would be successful under normal conditions. However, I think we have to resist this impulse and instead simply work on making more precise predictions. If we start crediting predictions that are right for the wrong reasons, even if it seems like the "spirit" of the prediction is right, this seems to open the door to relying on intuition and falling into the traps that contaminate much of modern philosophy.

What we really need to do in such cases seems to be to break down our claims into more specific predictions, splitting them into multiple sub-predictions if necessary. My prediction about the outcome of the fencing bout could better be expressed as multiple predictions, for instance "I will score more points than my opponent" and "I will win the bout." Some may notice that this is similar to the implicit justification being made in the original prediction. This is fitting-- drawing out such implicit details is key to making accurate predictions. In fact, this example itself was improved by tabooing[1] "better" in the vague initial sentence "I will fence better than my opponent."
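
As a hypothetical sketch of what this decomposition looks like in practice (the claims and outcomes below simply restate the fencing example), score each sub-prediction separately rather than giving the compound claim full credit:

```python
# Score each sub-prediction on its own, so a bout won "for the wrong
# reasons" doesn't silently count as a fully successful prediction.
predictions = {
    "I will score more points than my opponent": False,  # lost 14 straight touches
    "I will win the bout": True,                          # won by disqualification
}

for claim, came_true in predictions.items():
    print(f"{'RIGHT' if came_true else 'WRONG'}: {claim}")

accuracy = sum(predictions.values()) / len(predictions)
print(f"Accuracy across sub-predictions: {accuracy:.0%}")  # 50%, not 100%
```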

In order to make better predictions, we must cast out those predictions that are right for the wrong reasons. While it may be tempting to award such efforts partial credit, doing so runs against the spirit of the truth. The true skill of cartography requires forming maps that are both accurate and reproducible; lucking into accuracy may be nice, but it speaks ill of the reproducibility of your methods.

 

[1] I strongly suggest that you make tabooing a five-second skill, and, better still, that you learn to recognize when you need to apply it to your own thought processes. It pays great dividends in terms of precise thought.

Don't Build Fallout Shelters

26 katydee 07 January 2013 02:38PM

Related: Circular Altruism

One thing that many people misunderstand is the concept of personal versus societal safety. These concepts are often conflated despite the appropriate mindsets being quite different.

Simply put, personal safety is personal.

In other words, the appropriate actions to take for personal safety are whichever actions reduce your chance of being injured or killed within reasonable cost boundaries. These actions are largely based on situational factors because the elements of risk that two given people experience may be wildly disparate.

For instance, if you are currently a young computer programmer living in a typical American city, you may want to look at eating better, driving your car less often, and giving up unhealthy habits like smoking. However, if you are currently an infantryman about to deploy to Afghanistan, you may want to look at improving your reaction time, training your situational awareness, and practicing rifle shooting under stressful conditions.

One common mistake is to attempt to preserve personal safety in extreme circumstances such as nuclear wars. Some individuals invest sizeable amounts of money into fallout shelters, years' worth of emergency supplies, etc.

While it is certainly true that a nuclear war, if it occurred, would kill you or severely disrupt your life, this is not in itself a convincing argument for building a fallout shelter. One has to consider the cost of building the shelter, the chance that it will actually save you in the event of a nuclear war, and the odds of a nuclear war actually occurring.

Further, one must consider the quality of life reduction that one would likely experience in a post-nuclear war world. It's also important to remember that, in the long run, your survival is contingent on access to medicine and scientific progress. Future medical advances may even extend your lifespan very dramatically, and potentially provide very large amounts of utility. Unfortunately, full-scale nuclear war is very likely to impair medicine and science for quite some time, perhaps permanently.

Thus even if your fallout shelter succeeds, you will likely live a shorter and less pleasant life than you would otherwise. In the end, building a fallout shelter looks like an unwise investment unless you are extremely confident that a nuclear war will occur shortly-- and if you are, I want to see your data!

When taking personal precautionary measures, worrying about such catastrophes is generally silly, especially given the risks we all take on a regular basis-- risks that, in most cases, are much easier to avoid than nuclear wars. Societal disasters are generally extremely expensive for the individual to protect against, and carry a large amount of disutility even if protections succeed.

To make matters worse, if there's a nuclear war tomorrow and your house is hit directly, you'll be just as dead as if you fall off your bike and break your neck. Dying in a more dramatic fashion does not, generally speaking, produce more disutility than dying in a mundane fashion does. In other words, when optimizing for personal safety, focus on accidents, not nuclear wars; buy a bike helmet, not a fallout shelter.
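
The structure of that comparison can be made explicit with a quick expected-value sketch; every number below is invented for illustration, not a real risk estimate:

```python
# All figures are made up to illustrate the argument's shape.
def expected_value(p_event, p_protection_works, value_of_life_saved, cost):
    """Expected payoff of a precaution against one specific way of dying."""
    return p_event * p_protection_works * value_of_life_saved - cost

life_post_war = 1_000_000  # crude units: a shorter, less pleasant post-war life
life_normal   = 5_000_000  # crude units: your ordinary remaining life

shelter = expected_value(p_event=0.01, p_protection_works=0.3,
                         value_of_life_saved=life_post_war, cost=50_000)
helmet = expected_value(p_event=0.001, p_protection_works=0.5,
                        value_of_life_saved=life_normal, cost=50)

print(f"Fallout shelter: {shelter:+,.0f}")  # -47,000: a bad buy
print(f"Bike helmet:     {helmet:+,.0f}")   # +2,450: a good one
```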

The flip side to this, of course, is that if there is a full-scale nuclear war, hundreds of millions-- if not billions-- of people will die and society will be permanently disrupted. If you die in a bike accident tomorrow, perhaps a half dozen people will be killed at most. So when we focus on non-selfish actions, the big picture is far, far, far more important. If you can reduce the odds of a nuclear war by one one-thousandth of one percent, more lives will be saved on average than if you can prevent hundreds of fatal accidents.
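
The arithmetic behind that claim, using an assumed death toll in the middle of the post's "hundreds of millions-- if not billions" range:

```python
deaths_in_full_scale_war = 500_000_000  # assumption: mid-range of the post's figure
risk_reduction = 0.001 / 100            # one one-thousandth of one percent

expected_lives_saved = deaths_in_full_scale_war * risk_reduction
print(f"{expected_lives_saved:,.0f}")   # 5,000 -- versus "hundreds" of accidents
```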

When optimizing for overall safety, focus on the biggest possible threats that you can have an impact on. In other words, when dealing with societal-level risks, your projected impact will be much higher if you try to focus on protecting society instead of protecting yourself.

In the end, building fallout shelters is probably silly, but attempting to reduce the risk of nuclear war sure as hell isn't. And if you do end up worrying about whether a nuclear war is about to happen, remember that if you can reduce the risk of said war-- which might be as easy as making a movie-- your actions will have a much, much greater overall impact than building a shelter ever could.

Rationality Quotes January 2013

6 katydee 02 January 2013 05:23PM

Happy New Year! Here's the latest and greatest installment of rationality quotes. Remember:

  • Please post all quotes separately, so that they can be voted up/down separately.  (If they are strongly related, reply to your own comments.  If strongly ordered, then go ahead and post them together.)
  • Do not quote yourself
  • Do not quote comments/posts on LessWrong or Overcoming Bias
  • No more than 5 quotes per person per monthly thread, please

Philosophy Needs to Trust Your Rationality Even Though It Shouldn't

27 lukeprog 29 November 2012 09:00PM

Part of the sequence: Rationality and Philosophy

Philosophy is notable for the extent to which disagreements with respect to even those most basic questions persist among its most able practitioners, despite the fact that the arguments thought relevant to the disputed questions are typically well-known to all parties to the dispute.

Thomas Kelly

The goal of philosophy is to uncover certain truths... [But] philosophy continually leads experts with the highest degree of epistemic virtue, doing the very best they can, to accept a wide array of incompatible doctrines. Therefore, philosophy is an unreliable instrument for finding truth. A person who enters the field is highly unlikely to arrive at true answers to philosophical questions.

Jason Brennan

 

After millennia of debate, philosophers remain heavily divided on many core issues. According to the largest-ever survey of philosophers, they're split 25%-24%-18% on deontology / consequentialism / virtue ethics, 35%-27% on empiricism vs. rationalism, and 57%-27% on physicalism vs. non-physicalism.

Sometimes, they are even divided on psychological questions that psychologists have already answered: Philosophers are split evenly on the question of whether it's possible to make a moral judgment without being motivated to abide by that judgment, even though we already know that this is possible for some people with damage to their brain's reward system-- for example, many Parkinson's patients, and patients with damage to the ventromedial frontal cortex (Schroeder et al. 2012).1

Why are physicists, biologists, and psychologists more prone to reach consensus than philosophers?2 One standard story is that "the method of science is to amass such an enormous mountain of evidence that... scientists cannot ignore it." Hence, religionists might still argue that Earth is flat or that evolutionary theory and the Big Bang theory are "lies from the pit of hell," and philosophers might still be divided about whether somebody can make a moral judgment they aren't themselves motivated by, but scientists have reached consensus about such things.

continue reading »

Nov 16-18: Rationality for Entrepreneurs

25 AnnaSalamon 08 November 2012 06:15PM

CFAR is taking LW-style rationality into the world, this month, with a new kind of rationality camp: Rationality for Entrepreneurs.  It is aimed at ambitious, relatively successful folk (regardless of whether they are familiar with LW), who like analytic thinking and care about making practical real-world projects work.  Some will be paying for themselves; others will be covered by their companies.  

If you'd like to learn rationality in a more practical context, consider applying.  Also, if you were hoping to introduce rationality and related ideas to a friend/acquaintance who fits the bill, please talk to them about the workshop, both for their sake and to strengthen the rationality community.

The price will be out of reach for some: the workshop costs $3.9k.  But there is a money-back guarantee.  Some partial scholarships may be available. This fee buys participants:

  • Four nights and three days at a retreat center, with small classes, interactive exercises, and much opportunity for unstructured conversation that applies the material at meals and during the evenings (room and board is included);
  • One instructor for every three participants; 
  • Six weeks of Skype/phone and email follow-up, to help participants make the material into regular habits, and navigate real-life business and personal situations with these tools.

CFAR is planning future camps which are more directly targeted at a Less Wrong audience (like our previous camps), so don’t worry if this camp doesn’t seem like the right fit for you (because of cost, interests, etc.).  There will be others.  But if you or someone you know does have an entrepreneurial bent[1], then we strongly recommend applying to this camp rather than waiting.  Attendees will be surrounded by other ambitious, successful, practically-minded folks, learn from materials that have been tailored to entrepreneurial issues, and receive extensive follow-up to help apply what they’ve learned to their businesses and personal lives.

Our schedule is below.

(See also the thread about the camp on Hacker News.)

continue reading »

Checklist of Rationality Habits

117 AnnaSalamon 07 November 2012 09:19PM

As you may know, the Center for Applied Rationality has run several workshops, each teaching content similar to that in the core sequences, but made more practical, and more into fine-grained habits.

Below is the checklist of rationality habits we have been using in the minicamps' opening session.  It was co-written by Eliezer, myself, and a number of others at CFAR.  As mentioned below, the goal is not to assess how "rational" you are, but, rather, to develop a personal shopping list of habits to consider developing.  We generated it by asking ourselves, not what rationality content it's useful to understand, but what rationality-related actions (or thinking habits) it's useful to actually do.

I hope you find it useful; I certainly have.  Comments and suggestions are most welcome; it remains a work in progress. (It's also available as a pdf.) 
continue reading »
