
Idea for LessWrong: Video Tutoring

10 adamzerner 23 June 2017 09:40PM

Idea: we coordinate to teach each other things via video chat.

  • We (mostly) all like learning, whether for fun, out of curiosity, or as a stepping stone towards our goals.
  • My intuition is that there are a lot of us who also enjoy teaching. I do, personally.
  • Enjoyment aside, teaching is a good way of solidifying one's knowledge.
  • Perhaps there would be positive unintended consequences, e.g. social ones.
  • Why video? a) I assume that medium is better for education than simply text. b) Social and motivational benefits, maybe. A downside to video is that some may find it intimidating.
  • It may be nice to evolve this into a group project where we iteratively figure out how to do a really good job teaching certain topics.
  • I see the main value in personalization, as opposed to passive lectures/seminars. Those already exist, and are plentiful for most topics. What isn't easily accessible is personalization. With that said, I figure it'd make sense to have about 5 learners per teacher.

So, this seems like something that would be mutually beneficial. To get started, we'd need:

  1. A place to do this. No problem: there's Hangouts, Skype, https://talky.io/, etc.
  2. To coordinate topics and times.

Personally, I'm not sure how much I can offer as far as doing the teaching. I worked as a web developer for 1.5 years and have been teaching myself computer science. I could be helpful to those unfamiliar with those fields, but probably not too much help for those already in the field and looking to grow. But I'm interested in learning about lots of things!

Perhaps a good place to start would be to record in a spreadsheet: a) people who want to teach, b) what topics they can teach, and c) who is interested in being a Learner. Getting more specific about who wants to learn what may be overkill, as we all seem to have roughly similar interests. Or maybe it isn't.

If you're interested in being a Learner or a Teacher, please add yourself to this spreadsheet.

Develop skills, or "dive in" and start a startup?

1 adamzerner 26 May 2017 06:07PM

Technical skills

There seems to be evidence that programmer productivity varies by at least an order of magnitude. My subjective sense is that I personally can become a lot more productive.

Conventional wisdom says that it's important to build and iterate quickly. Technical skills (amongst other things) are necessary if you want to build and iterate quickly. So then, it seems worthwhile to develop your technical skills before pursuing a startup. To what extent is this true?

Domain expertise

Furthermore, domain expertise seems to be important:

You want to know how to paint a perfect painting? It's easy. Make yourself perfect and then just paint naturally.

I've wondered about that passage since I read it in high school. I'm not sure how useful his advice is for painting specifically, but it fits this situation well. Empirically, the way to have good startup ideas is to become the sort of person who has them.

- http://www.paulgraham.com/startupideas.html

The second counterintuitive point is that it's not that important to know a lot about startups. The way to succeed in a startup is not to be an expert on startups, but to be an expert on your users and the problem you're solving for them.

- http://www.paulgraham.com/before.html

So one guaranteed way to turn your mind into the type that has good startup ideas is to get yourself to the leading edge of some technology—to cause yourself, as Paul Buchheit put it, to "live in the future."

- http://www.paulgraham.com/startupideas.html

So then, if your goal is to start a successful startup, how much time should you spend developing some sort of domain expertise before diving in?

How I'd Introduce LessWrong to an Outsider

4 adamzerner 03 May 2017 04:32AM

Note/edit: I'm imagining explaining this to a friend or family member who is at least somewhat charitable and trusting of my judgement. I am not imagining simply putting this on the About page. I should have made this clear from the beginning - my bad. However, I do believe that some (but not all) of the design decisions would be effective on something like the About page as well.


There's this guy named Eliezer Yudkowsky. He's really, really smart. He founded MIRI, wrote a popular fanfic of Harry Potter that centers around rationality, and has a particularly strong background in AI, probability theory, and decision theory. There's another guy named Robin Hanson. Hanson is an economics professor at George Mason, and has a background in physics, AI and statistics. He's also really, really smart.

Yudkowsky and Hanson started a blog called Overcoming Bias in November of 2006. They blogged about rationality. Later on, Yudkowsky left Overcoming Bias and started his own blog - LessWrong.

What is rationality? Well, for starters, it's incredibly interdisciplinary. It involves academic fields like probability theory, decision theory, logic, evolutionary psychology, cognitive biases, lots of philosophy, and AI. The goal of rationality is to help you be right about the things you believe. In other words, the goal of rationality is to be wrong less often. To be LessWrong.

Weird? Useful?

LessWrong may seem fringe-y and cult-y, but the teachings are usually things that aren't controversial at all. Again, rationality teaches you things like probability theory and evolutionary psychology. Things that academics all agree on. Things that academics have studied pretty thoroughly. Sometimes the findings haven't made it to mainstream culture yet, but they're almost always things that the experts all agree on and consider to be pretty obvious. These aren't some weird nerds cooped up in their parents' basement preaching crazy ideas they came up with. These are early adopters who are taking things that have already been discovered, bringing them together, and showing us how the findings could help us be wrong less frequently.

Rationalists tend to be a little "weird" though. And they tend to believe a lot of "weird" things. A lot of science-fiction-y things. They believe we're going to blend with robots and become transhumans soon. They believe that we may be able to freeze ourselves before we die, and then be revived by future generations. They believe that we may be able to upload our consciousness to a computer and live as a simulation. They believe that computers are going to become super powerful and completely take over the world.

Personally, I don't understand these things well enough to really speak to their plausibility. My impression so far is that rationalists have very good reasons for believing what they believe, and that they're probably right. But perhaps you don't share this impression. Perhaps you think those conclusions are wacky and ridiculous. Even if you think this, it's still possible that the techniques may be useful to you, right? It's possible that rationalists have misapplied the techniques in some ways, but that if you learn the techniques and add them to your arsenal, they'll help you level up. Consider this before writing rationality off as wacky.

Overview

So, what does rationality teach you? Here's my overview:

  • The difference between reality and our models of reality (see map vs. territory).
  • That things are their components. Airplanes are made up of quarks. "Airplane" is a concept we created to model reality.
  • To think in gray. To say, "I sense that X is true" and "I'm pretty sure that X is true" instead of "X is true".
  • To update your beliefs incrementally. To say, "I still don't think X is true, but now that you've shown me Y, I'm somewhat less confident." On the other hand, a Black And White Thinker would say, "Eh, even though you showed me Y, I still just don't think X is true."
  • How much we should actually update our beliefs when we come across a new observation. A little? A lot? Bayes' theorem has the answers. It is a fundamental component of rationality (see the short sketch after this list).
  • That science, as an institution, prevents you from updating your beliefs quickly enough. Why? Because it requires a lot of good data before you're allowed to update your beliefs at all. Even just a little bit. Of course you shouldn't update too much with bad data, but you should still nudge your beliefs a bit in the direction that the data point toward.
  • To make your beliefs about things that are actually observable. Think: if a tree falls in a forest and no one hears it, does it make a sound? Adding this technique to your arsenal will help you make sense of a lot of philosophical dilemmas. 
  • To make decisions based on consequences. To distinguish between your end goal, and the stepping stones you must pass on your way there. People often forget what it is that they are actually pursuing, and get tricked into pursuing the stepping stones alone. Ex. getting too caught up moving up the career ladder.
  • How evolution really works, and how it helps explain why we are the way we are today. Hint: it's slow and stupid.
  • How quantum physics really works.
  • How words can be wrong.
  • Utilitarian ethics.
  • That you have A LOT of biases. And that by understanding them, you could side-step the pain that they would otherwise have caused you.
  • Similarly, that you have A LOT of "failure modes", and that by understanding them, you could side-step a lot of the pain that they would otherwise have caused you.
  • Lots of healthy mindsets you should take. For example:
    • Tsuyoku Naritai - "I want to become stronger!"
    • Notice when you're confused.
    • Recognize that being wrong is exciting, and something you should embrace - it means you are about to learn something new and level up!
    • Don't just believe the opposite of what your stupid opponent believes out of frustration and spite. Sometimes they're right for the wrong reasons. Sometimes there's a third alternative you're not considering.
    • To give something a fair chance, be sure to think about it for five minutes by the clock.
    • When you're wrong, scream "OOPS!". That way, you could just move on in the right direction immediately. Don't just make minor concessions and rationalize why you were only partially wrong.
    • Don't be content with just trying. You'll give up too early if you do that.
    • "Impossible" things are often not actually impossible. Consider how impossible wireless communication would seem to someone who lived 500 years ago. Try studying something for a year or five before you claim that it is impossible.
    • Don't say things to sound cool, say them because they're true. Don't be overly humble. Don't try to sound wise by being overly neutral and cautious.
    • "Mere reality" is actually pretty awesome. You could vibrate air molecules in an extremely, extremely precise way, such that you could take the contents of your mind and put them inside another persons mind? What???? Yeah. It's called talking.
    • Shut up and calculate. Sometimes things aren't intuitive, and you just have to trust the math.
    • It doesn't matter how good you are relative to others, it matters how good you are in an absolute sense. Reality doesn't grade you on a curve.
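
To make the Bayes' theorem bullet above concrete, here is a minimal sketch in TypeScript. The function name and probabilities are mine, invented purely for illustration:

// Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E),
// where P(E) = P(E|H) * P(H) + P(E|~H) * P(~H).
function bayesUpdate(prior: number, pEGivenH: number, pEGivenNotH: number): number {
  const pEvidence = pEGivenH * prior + pEGivenNotH * (1 - prior);
  return (pEGivenH * prior) / pEvidence;
}

bayesUpdate(0.5, 0.6, 0.5); // ≈ 0.55 - weak evidence nudges a 50% belief a little
bayesUpdate(0.5, 0.9, 0.1); // = 0.90 - strong evidence moves it a lot

The point of the sketch: "update a little" and "update a lot" both fall out of the same formula. The likelihoods determine exactly how far you should move.
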
Sound interesting? Good! It is!

Eliezer wrote about all of this stuff in bite sized blog posts (he claims it helps him write faster). About one per day. Originally, the collection of posts was referred to as The Sequences, and was organized into categories. More recently, the posts were refined and brought together into a book - Rationality: From AI to Zombies.

Personally, I believe the writing is dense and difficult to follow. Things like AI are often used as examples in places where a more accessible example could have been used instead. Eliezer himself confesses that he needs to "aim lower". Still, the content is awesome, insightful, and useful, so if you can make your way past some of the less clear explanations, I think you have a lot to gain. Personally, I find the Wiki and the article summaries to be incredibly useful. There's also HPMOR - a fanfic Eliezer wrote to describe the teachings of rationality in a more accessible way.

Gaps

So far, there hasn't been enough of a focus on applying rationality to help you win in everyday life. Instead, the focus has been on solving big, difficult, theoretical problems. Eliezer mentions this in the preface of Rationality: From AI to Zombies. Developing the more practical, applied part of The Art is definitely something that needs to be done.

Learning how to rationally work in groups is another thing that really needs to be done. Unfortunately, rationalists aren't particularly good at working together. So far.

Community

From 2009-2014 (excluding 2010), there were surveys of the LessWrong readership. There were usually about 1,500 responders, which tells you something about the size of the community (note that there are people who read/lurk/comment, but who didn't submit the survey). Readers live throughout the globe, and tend to come from the atheist/libertarian/technophile/sf-fan/early-adopter/programmer/etc. crowd. There are also a lot of effective altruists - people who try to do good for the world, and who try to do so as efficiently as possible. See the wiki's FAQ for results of these surveys.

There are meet-ups in many cities, and in many countries. Berkeley is considered to be the "hub". See How to Run a Successful LessWrong Meetup for a sense of what these meet-ups are like. Additionally, there is a Slack group, and an online study hall. Both are pretty active.

Community members mostly agree with the material described in The Sequences. This common jumping off point makes communication smoother and more productive. And often more fulfilling.

The culture amongst LessWrongians is something that may take some getting used to. Community members tend to:

  • Be polyamorous.
  • Drink Soylent.
  • Communicate explicitly. Eg. "I'm beginning to find this conversation aversive, and I'm not sure why. I propose we hold off until I've figured that out."
  • Be a bit socially awkward (about 1/4 are on the autism spectrum).
  • Use lots of odd expressions.
In addition... they're totally awesome! In my experience, I've found them to be particularly caring, altruistic, empathetic, open-minded, good at communicating, humble, intelligent, interesting, reasonable, hard working, respectful and honest. Those are the kind of people I'd like to spend my time amongst.

Diaspora

LessWrong isn't nearly as active as it used to be. In "the golden era", Eliezer along with a group of other core contributors would post insightful things many times each week. Now, these core contributors have fled to work on their own projects and do their own things. There is much less posting on lesswrong.com than there used to be, but there is still some. And there is still related activity elsewhere. See the wiki's FAQ for more.

Related Organizations

MIRI - Tries to make sure AI is nice to humans.

CFAR - Runs workshops that focus on being useful to people in their everyday lives.


Meta:

Of course, I may have misunderstood certain things. Ex. I don't feel that I have a great grasp on bayesianism vs. science. If so, please let me know.

Note: in some places, I exaggerated slightly for the sake of a smoother narrative. I don't feel that the exaggerations interfere with the spirit of the points made (DH6). If you disagree, please let me know by commenting.

New meet up in Las Vegas!

2 adamzerner 28 April 2017 11:57PM

Hey guys, I'd just like to announce that I'm starting a new meet up in Las Vegas!

WHEN: First and third Sunday of the month. 7pm-9pm.

WHERE: The Market (downtown on Fremont Street).

See http://lesswrong.com/meetups/1xg.

Meetup : Las Vegas Meetup

0 adamzerner 28 April 2017 12:52AM

WHEN: 30 April 2017 05:52:32PM (-0700)

WHERE: 611 Fremont Street, Las Vegas, NV 89101

Email me at azerner3@gmail.com if you're interested in a Las Vegas meet up.


Should we admit it when a person/group is "better" than another person/group?

0 adamzerner 16 February 2016 09:43AM

This sort of thinking seems bad:

me.INTRINSIC_WORTH = 99999999; // No matter what I do, this fixed property will remain constant.

This sort of thinking seems socially frowned upon, but accurate:

a.impactOnSociety(time) > b.impactOnSociety(time)

a.qualityOfCharacter > b.qualityOfCharacter // determined by things like altruism, grit, courage, self awareness...

Similar points could be made by replacing a/b with [group of people]. I think it's terrible to say something like:

This race is inherently better than that race. I refuse to change my mind, regardless of the evidence brought before me.

But to me, it doesn't seem wrong to say something like:

Based on what I've seen, I think that the median member of Group A has a higher qualityOfCharacter than the median member of Group B. I don't think there's anything inherently better about Group A. It's just based on what I've observed. If presented with enough evidence, I will change my mind.

Credit and accountability seem like good things to me, and so I want to live in a world where people/groups receive credit for good qualities, and are held accountable for bad qualities.

I'm not sure though. I could see that there are unintended consequences of such a world. For example, such "score keeping" could lead to contentiousness. And perhaps it's just something that we as a society (to generalize) can't handle, and thus shouldn't keep score.

Sports

12 adamzerner 26 December 2015 07:54PM

This is intended to be a pretty broad discussion of sports. I have some thoughts, but feel free to start your own threads.


tl;dr - My impression is that people here aren't very interested in sports. My impression[1] is that most people have something to gain from both competitive and recreational sports. With competitive sports you have to be careful not to overdo it. With recreational sports, the circumstances have to be right for it to be enjoyable. I also think that sports get a bad rep for being simple and dull. In actuality, there's a lot of complexity.

[1] Why does this have to sound bad?! I have two statements I want to make. And for each of them, I want to qualify it by saying that it is an impression that I have. What is a better way to say this?

Me

I love sports. Particularly basketball. I was extremely, extremely dedicated to it back in middle/high school. Actually, it was pretty much all I cared about (not an exaggeration). This may or may not be crazy... but I wanted to be the best player who's ever lived. That was what I genuinely aspired to and was working towards (~7th-11th grade).

My thinking: the pros practice, what, 5-6 hours a day? I don't care about anything other than basketball. I'm willing to practice 14 hours a day! I just need time to eat and sleep, but other than that, I value basketball above all else (friends, school...). Plus, I will work so much smarter than they do! The norm is to mindlessly do push ups and eat McDonalds. I will read the scientific literature and figure out what the most effective ways to improve are. I'm short and not too athletic, so I knew I was starting at a disadvantage, but I saw a mismatch between what the norm is and what my rate of improvement could be. I thought I could do it.

In some ways I succeeded, but ultimately I didn't come close to my goal of greatness. In short, I spent too much time on high-level actions such as researching training methods and not enough time on object-level work; and with school and homework, I simply didn't have enough time to put in the 14-hour days I envisioned. I was a solid high school player, but was nowhere near good enough to play college ball.

Takeaways

Intense work. I've gone through some pretty intense physical exercise. Ex. running suicides until you collapse. And then getting up to do more until you collapse again. It takes a lot of willpower to do that. I think willpower is like a muscle, and you have to train yourself to be able to work at such intensities. I haven't experienced anything intellectual that has required such intensity. Knowing that I am capable of working at high intensities has given me confidence that "I could do anything".

Ambition. The culture in athletic circles is often one where, "I'm not content being where I am". There's someone above you, and you want to beat them out. I guess that sort of exists in academic and career circles as well, but I don't think it's the same (in the average case; there are certainly exceptions). What explains this? Maybe there's something very visceral about lining up across from someone, getting physically and unambiguously beaten, and letting your teammates and yourself down.

Confidence. Oftentimes, confidence is something you learn because you have to. Oftentimes, if you're not confident, you won't perform, so you need to learn to be confident. But it's not just that; there's something else about the culture that promotes confidence (perhaps cockiness). Think: "I don't care who the opponent is, no one can stop me!".

Group Bonds. When you spend so much time with a group of people, go through exhausting practices together, and work as a team to experience wins and losses, you develop a certain bond that is enjoyable. It reminds me a bit of putting in long hours on a project and eventually meeting the deadline, but it isn't the same.

Other: There are certainly other things I'm forgetting.

All of that said, there are downsides that correspond with all of these benefits. My overarching opinion is "all things in moderation". Ambition can be poison. So can the habitual productivity that often comes with ambition. Sometimes the atmosphere can backfire and make you less confident. And sometimes teammates can bully and be cruel. I've experienced the good and bad extremes along all of these axes.

Honestly, I'm not quite sure when it's worth it and when it isn't. I think it often depends on the person and the situation, but I think that in moderation, most people have a decent amount to gain (in aggregate) by experiencing these things.

Recreational

So far I've really only talked about competitive sports. Now I want to talk about recreational sports. With competitive sports, as I mention above, I think there's a somewhat fine line between underdoing it and overdoing it. But I think that line is a lot wider for recreational sports. I think it's wide enough such that recreational sports are very often a good choice.

One huge benefit of recreational sports is that it's a fun way to get exercise. You do/should exercise anyway; why not make a game out of it?

Part of me feels like sports are just inherently fun! I know that calling them inherently fun is too strong a statement, but I think that under the right circumstances, they often are fun (I think the same point can be applied to most other things as well).

In practice, what goes wrong?

  • You aren't in shape. You're playing a pick up basketball game where everyone else is running up and down the court and you're too winded to breathe. That's no fun.
  • Physical bumps and bruises. You're playing football and get knocked around, or perhaps injured.
  • Lack of involvement.
    • You're playing baseball. You only get to hit 1/18th of the time. And you are playing right field and no one ever hits it to you (for these reasons, I don't like baseball).
    • You're playing soccer with people who don't know how to space the field and move the ball, and you happen to get excluded.
    • You're playing basketball where each team has a ball hog who brings up the ball and shoots it every possession.
  • Difficulty-skill mismatch. You're playing with people who are way too good for you, so it isn't fun. Alternatively, maybe you're way better than the people you're playing with and aren't being challenged.
  • Other. Again, I'm sure there are things I'm not thinking of.
For the most part, I feel like the things that go wrong are correctable, and once corrected, I predict that the sport will become enjoyable (some things are inherent, like the bumps and bruises in tackle football; but there's always two-hand touch!).

I even see a business opportunity here! Currently, these are all legitimate problems. I think that if these were corrected, a lot of utility/value would be generated. What if you could sign up and be provided with recreational games, with enough time for you to rest so you're not exhausted, where your teammates and opponents are respectful and considerate, where you're involved in the game, and where your teammates and opponents are roughly at your skill level?

Complexity

I sense that sports get a bit of an unfair rep for being simple and dull games. Maybe some are, but I think that most aren't.

Perhaps it's because of the way most people experience the game. Take basketball as an example. A lot of people just like to watch to see whether the ball goes in the hoop or not and cheer. Ie. they experience the game in a very binary way. Observing this, it may be tempting to think, "Ugh, what a stupid game." But what happens when you steelman?

I happen to know a lot about basketball, so I experience the game very differently. Here's an example:

Iguodala has the ball and is being guarded by LeBron. LeBron is playing close and is in a staggered stance. He's vulnerable and Iguodala should attack his lead foot. People (even NBA players) don't look at this enough! Actually no, he shouldn't attack: the weak side help defense looks like it's in position, and LeBron is great at recovery. Plus, you have to think about the opportunity cost. Curry has Dellavedova and could definitely take him. Meaning, if Delly plays off, Curry can take a shot, but if Delly plays him more tightly, Curry could penetrate and either score or set someone else up, depending on how the help defense reacts. That approach has a pretty high expected value. But actually, Draymond Green looks like he has JR Smith on him (who is much smaller), which probably has an even higher expected value than Curry taking Delly. But to get Green the ball they'd have to reverse it to the weak side, and they'd have to keep the court spaced such that the Cavs won't have an opportunity to switch a bigger defender on to Green. All of this is in contrast with running a motion offense or some set plays. And you also have to take into account the stamina of the other team. Maybe you want to attack LeBron on defense to make him work, get him tired, and make him less effective on offense (I think this is a great approach to take against Curry and the Warriors, because Curry isn't a good defender and is lethal on offense).

Hopefully you can see that the amount of information there is to process in any given second is extremely high! If you know what to look for. Personally, I've never played organized football. But after playing the video game Madden (and doing some further research), I've learned a good amount about how the game works. Now when I watch football, I know the intricacies of the game and am watching for them. The density of information + the excitement, skill and physicality makes these sports extremely enjoyable for me to watch. Alternatively, I don't know too much about golf and don't enjoy watching it. All I see when I watch golf is, "The ball was hit closer to the hole... the ball was hit closer to the hole... the ball was hit into the hole. This was a par 3, so that must have been an average performance."

 

Non-communicable Evidence

9 adamzerner 17 November 2015 03:46AM

In this video, Douglas Crockford (JavaScript MASTER) says:

So I think programming would not be possible without System I; without the gut. Now, I have absolutely no evidence to support that statement, but my gut tells me it's true, so I believe it.

 

1

I don't think he has "absolutely no evidence". In worlds where DOUGLAS CROCKFORD has a gut feeling about something related to programming, how often does that gut feeling end up being correct? Probably a lot more than 50% of the time. So according to Bayes, his gut feeling is definitely evidence.

The problem isn't that he lacks evidence. It's that he lacks communicable evidence. He can't say "I believe A because X, Y and Z." The best he could do is say, "just trust me, I have a feeling about this".

Well, "just trust me, I have a feeling about this" does qualify as evidence if you have a good track record, but my point is that he can't communicate the rest of the evidence his brain used to produce the resulting belief.

 

2

How do you handle a situation where you're having a conversation with someone and they say, "I can't explain why I believe X; I just do."

Well, as far as updating beliefs, I think the best you could do is update on the track record of the person. I don't see any way around it. For example, you should update your beliefs when you hear Douglas Crockford say that he has a gut feeling about something related to programming. But I don't see how you could do any further updating of your beliefs. You can't actually see the evidence he used, so you can't use it to update your beliefs. If you do, the Bayes Police will come find you.

Perhaps it's also worth trying to dig the evidence out of the other person's subconscious.

  • If the person has a good track record, maybe you could say, "Hmm, you have a good track record so I'm sad to hear that you're struggling to recall why it is you believe what you do. I'd be happy to wait for you to spend some time trying to dig it up."
  • Maybe there are some techniques that can be used to "dig evidence out of one's subconscious". I don't know of any, but maybe they exist.

 

3

Ok, now let's talk about what you shouldn't do. You shouldn't say, "Well if you can't provide any evidence, you shouldn't believe what you do." The problem with that statement is that it assumes that the person has "no evidence". This was addressed in Section 1. It's akin to saying, "Well Douglas Crockford, you're telling me that you believe X and you have a fantastic track record, but I don't know anything about why you believe it, so I'm not going to update my beliefs at all, and you shouldn't either."

Brains are weird and fantastic thingys. They process information and produce outputs in the form of beliefs (amongst other things). Sometimes they're nice and they say, "Ok Adam - here is what you believe, and here is why you believe it". Other times they're not so nice and the conversation goes like this:

Brain: Ok Adam, here is what you think.

Adam: Awesome, thanks! But wait - why do I think that?

Brain: Fuck you, I'm not telling.

Adam: Fuck me? Fuck you!

Brain: Who the fuck do you think you're talking to?!!!

Just because brains can be mean doesn't mean they should be discounted.

What is your rationalist backstory?

7 adamzerner 25 September 2015 01:25AM

I'm reading Dan Ariely's book Predictably Irrational. The story of what got him interested in rationality and human biases goes something like this.

He was the victim of a really bad accident, and had terrible burns covering ~70% of his body. The experience was incredibly painful, and so was the treatment. For treatment, he'd have to bathe in some sort of disinfectant, and then have bandages ripped off his exposed flesh afterwards, which was extremely painful for him.

The nurses believed that ripping the bandages off quickly would produce the least amount of pain for the patient. They thought the short and intense bursts of pain were less (in aggregate) than the less intense but longer periods of pain that a slower removal of the bandages would produce. However, Dan disagreed about what would produce the least amount of pain for patients. He thought that a slower removal would be better. Eventually, he found scientific research showing that his theory was correct.

But he was confused. These nurses were smart people and had a ton of experience giving burn victims baths - shouldn't they have figured out by now what approaches best minimize patient pain? He knew their failure wasn't due to a lack of intelligence, and that it wasn't due to a lack of sympathy. He ultimately concluded that the failure was due to inherent human biases. He then became incredibly interested in this and went on to do a bunch of fantastic research in the area.

In my experience, the overwhelming majority of people are uninterested in rationality, and a lot of them are even put off by it. So I'm curious about how members of this incredibly small minority of the population became who they are.

Part of me thinks that extreme outputs are the result of extreme inputs. Like how Dan's extreme passion for his work has (seemingly) originated from his extreme experiences with pain. With this rule-of-thumb in mind, when I see someone who possesses some extreme character trait, I expect there to be some sort of extreme story or experience behind it.

But another part of me thinks that this doesn't really apply to rationality. I don't have much data, but from the limited experience I've had getting to know people in this community, "I've just always thought this way" seems common, and "extreme experiences that motivated rational thinking" seems rare.

Anyway, I'm interested in hearing people's "rationalist backstories". Personally, I'm interested in reading really long and detailed backstories, but am also interested in reading "just a few paragraphs". I'm also eager to hear people's thoughts on my "extreme input/output" theory.

Why Don't Rationalists Win?

6 adamzerner 05 September 2015 12:57AM

Here are my thoughts on the "Why don't rationalists win?" thing.

Epistemic

I think it's pretty clear that rationality helps people do a better job of being... less wrong :D

But seriously, I think that rationality does lead to very notable improvements in your ability to have correct beliefs about how the world works. And it helps you to calibrate your confidence. These abilities are useful. And I think rationality deserves credit for being useful in this area.

I'm not really elaborating here because I assume that this is something that we agree on.

However, I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones anyway), and that this may increase the "why don't rationalists win?" thing. I think that a big reason for this lack of progress is that a) we think about really really really difficult things! And b) we beat around the bush a lot. Big topics are often brought up, but I rarely see people say, "Ok, this is a huge topic, so in order to make progress, we're going to have to sit down for many hours and be deliberate about this. But I think we could do it!" Instead, these conversations seem to be just people having fun, procrastinating, and never investing enough time to make real progress.

Altruistic

I also think that rationality is doing a great job of helping people be more effectively altruistic. Another thing that:

  • I'm going to assume that we mostly agree on, and thus not really elaborate.
  • I think deserves to be noted and given credit.
  • Is useful.

For people with altruistic goals, rationality is helping them to achieve their goals. And I think it's doing a really good job at this. But I also think that the gains being made here don't quite feel big. I think that a major reason for this is that the gains are so:

  1. High level.
  2. Likely to be realized far in the future.
  3. The sort of thing that you don't personally experience (think: buying a poor person lunch vs. donating money to people in Africa).

But we all know that (1), (2), and (3) don't actually make the gains smaller; they just make them feel smaller. I get the impression that the fact that the gains feel smaller results in an unjustified increase in the "rationalists don't win" feeling.

Success

I get the impression that lack of success plays a big role in the "why don't rationalists win?" thing.

I guess an operational definition of success for this section could be "professional, financial, personal goals, being awesome...". 

I don't know much about this, but I would think and hope that rationality helps people to be notably more successful than they otherwise would be. I don't think rationality is at the point yet where it could make everyone millionaires (metaphorically and/or literally). But I think that a) it could get there, and b) we shouldn't trivialize the fact that it does (I'm assuming) make people notably more successful than they otherwise would be.

But still, I think that there are a lot of other factors that determine success, and given their difficulty/rarity, even with rationality in your toolbox, you won't achieve that much success without these things.

  1. Plain old hard work. I'm a huge believer in working smart, but I also think that given a pretty low and relatively sufficient level of smartness in your work, it's mostly a matter of how hard you work. You may ask yourself, "Take someone who studies really hard, but is lacking big time when it comes to rationality - wouldn't they be unsuccessful?" I think an important (and sad) point to make is that at this point in history, you can be very successful with domain-specific knowledge, but no rationality. And so people who work really hard but don't have an ounce of rationality often end up being very good at what they do, and very successful. I think we'll reach a point where things progress enough and rationality does in fact become necessary (the people with domain-specific knowledge but no rationality will fail).
  2. Aptitude/starting early. I'm not sure the extent to which aptitude is actually a thing. I sense that a big part of it is simply how early on you started. When your brain was at that "sponge-stage". Regardless, aptitude/starting early seems to be pretty important. Someone who works hard but started too late will certainly be at a disadvantage.
  3. Opportunity. In one sense, not much will help you if you have to work 3 jobs to survive (you won't have much time for self-improvement or other necessary investments of time). In another sense, there's the idea that "you are who you surround yourself with". So people who are fortunate enough to grow up around other smart and hard working people will have had the opportunity to be socially pressured into doing the same. I think this is very underrated, but also very surmountable. In another sense, some people are extremely fortunate and are born into a situation where they have a lot of money and connections.
  4. Ambition/confidence. Example: imagine a web developer who has rationality + (1) + (2) + (3) but doesn't have (4). He'll probably end up being a good web developer. But he might not end up being a great web developer. The reason is that he might not have the ambition or confidence to think to pursue certain skills. He may think, "that stuff is for truly smart people, I'm just not one of those people". And he may not have the confidence to pursue the goal of being a great software engineer (more general and wide-ranging). He may not have the confidence to learn C and other stuff. Note that there's a difference between not having the confidence to try, and not having the confidence to even think to try. I think that the latter is a lot more common, and blends into "ambition territory". On that note, this hypothetical person may not think to pursue innovative ideas, or get into UX, or start a startup and do something bigger.
My point in this section is that rationality can help with success, but 1-4 are also extremely important, and probably act as a limiting factor for most of us (I'd guess that most people here are rational enough such that 1-4 probably acts as a barrier to their success, and marginal increases in rationality probably won't have too big a marginal impact).

(I also bet that 1-4 is insufficient and that there are important things I'm missing.)

Happiness

I get the impression that lack of happiness plays a big role in the "why don't rationalists win?" thing.

Luke talked about the correlates of happiness in How to Be Happy:

Factors that don't correlate much with happiness include: age, gender, parenthood, intelligence, physical attractiveness, and money (as long as you're above the poverty line). Factors that correlate moderately with happiness include: health, social activity, and religiosity. Factors that correlate strongly with happiness include: genetics, love and relationship satisfaction, and work satisfaction.

One thing I want to note is that genetics seem to play a huge role, and that plus the HORRIBLE hedonic adaptation thing makes me think that we don't actually have that much control over our happiness.

Moving forward... and this is what motivated me to write this article... the big determinants of happiness seem like things that are sort of outside rationality's sphere of influence. I don't believe that, and it kills me to say it, but I thought it'd make more sense to say it first and then amend it (a writing technique I'm playing around with and am optimistic about). What I really believe is:

  • Things like social and romantic relationships are tremendously important factors in one's happiness. So is work satisfaction (in brief: autonomy, mastery and purpose).
  • These are things that you could certainly get without rationality. Non-rationalists have set a somewhat high bar for us to beat.
  • Rationality certainly COULD do wonders in this area.
  • But the art hasn't progressed to this point yet. Doing so would be difficult. People have been trying to figure out the secrets of happiness for 1000s of years, and though I think we've made some progress, we still have a long way to go.
  • Currently, I'm afraid that rationality might be acting as a memetic immune disorder. There's a huge focus on our flaws and how to mitigate them, and this leads to a lot of mental energy being spent thinking about "bad" things. I think (and don't know where the sources are) that a positive/optimistic outlook plays a huge role in happiness. "Focusing on the good." Rationality seems to focus a lot on "the bad". Rationality also seems to make people feel unproductive and wrong for not spending enough time focusing on and fixing this "bad", and I fear that this is overblown and leads to unnecessary unhappiness. At the same time, focusing on "the bad" is important: if you want to fix something, you have to spend a lot of time thinking about it. Personally, I struggle with this, and I'm not sure where the equilibrium point really is.

Social

Socially, LessWrong seems to be a rather large success to me. My understanding is that it started off with Eliezer and Robin just blogging... and now there are thousands of people having meet-ups across the globe. That amazes me. I can't think of any examples of something similar.

Furthermore, the social connections LW has helped create seem pretty valuable to me. There seem to be a lot of us who are incredibly unsatisfied with normal social interaction, or sometimes just plain old don't fit in. But LW has brought us together, and that seems incredible and very valuable to me. So it's not just "it helps you meet some cool people". It's "it's taken people who were previously empty, and has made them fulfilled".

Still though, I think there's a lot more that could be done. Rationalist dating website?* Rationalist pen pals (something that encourages the development of deeper 1-on-1 relationships)? A more general place that "encourages people to let their guard down and confide in each other"? Personal mentorship? This is venturing into a different area, but perhaps there could be some sort of professional networking?

*As someone who constantly thinks about startups, I'm liking the idea of "dating website for social group X that has a hard time relating to the rest of society". It could start off with X = 1, and expand, and the parent business could run all of it.

Failure?

So, are we a failure? Is everything moot because "rationalists don't win"?

I don't think so. I think that rationality has had a lot of impressive successes so far. And I think that it has

A LOT

of potential (did I forget any other indicators of visual weight there? it wouldn't let me add color). But it certainly hasn't made us super humans. I get an impression that because rationality has so much promise, we hold it to a crazy high standard and sometimes lose sight of the great things it provides. And then there's also the fact that it's only, what, a few decades old?


(Sorry for the bits of straw-manning throughout the post. I do think that it led to more effective communication at times, but I also don't think it was optimal by any means.)
