I believe that the number of smart rationalists has a profound influence on the quality of the future, so I would like to spread rationality at the maximum possible rate. In fact, I believe this is the most important problem I can work on right now. This post describes some of my beliefs about rationality outreach. Its purpose is threefold. First, by exposing my beliefs to criticism I hope to improve them. Second, by communicating my beliefs I hope to help others with similar goals. Third, by engaging in more concrete discussion of rationality outreach I hope to increase the probability that others will work seriously on the problem.


What Should Be Taught?

"Spread rationality" is a rather vague and unhelpful goal. Here are three behaviors / habits of mind which I believe would make the world a much better place if they caught on among smart people.

1. Desire for accurate beliefs. Many people don't feel that improving the quality of their beliefs is worthwhile. This is a combination of a failure to recognize exactly how wrong many of their beliefs are and a general perception that it doesn't matter anyway. Society as a whole is assumed to be "basically right," even by people who verbally acknowledge rather serious failures of humanity's collective rationality. This is related to the perception that there is no free lunch: as humans we resist the suggestion that we can get anything for free, and simply acting more intelligently seems like too easy a win to be real. The other side of this bias is a reluctance to believe that incorrect beliefs can really do any damage.

2. Mindful living. Many people direct incredible intelligence towards solving problems they encounter in their work, but fail to apply that intelligence to understanding the sources of their own beliefs/emotions, clarifying/refining their goals, or considering how to direct their resources to achieve those goals.

3. Initiative and confidence. It is very easy to believe that the world is a crazy place but that an individual has no power to change it. Regardless of how true that may be (my experience suggests not very), the world would probably be a better place if more really intelligent people internalized the belief that their actions change the world. Believing that your life is high stakes--that the potential upside is large--seems to be well correlated with doing great good.

Why Would People Listen?


Unfortunately, the majority of smart people don't believe that improving their own rationality is worthwhile (related to point 1 above). Someone who considers rationality a waste of time is unlikely to engage seriously with material presented in the context of rationality training, and is likely to be driven off rapidly by perceived proselytizing. They are unlikely to take seriously a website dedicated to rationality, read a book dedicated to rationality, go to a talk about rationality, etc. Moreover, spreading rationality is not really about transferring knowledge. To change someone's behavior, you need to give them a good reason to file advice about "rationality" into the compartment that actually affects behavior, and in general "because rationality is useful" does not constitute a sufficiently compelling reason (even if it is accepted as fact in the compartment for controlling beliefs).

Finding ways around this problem is one of the main difficulties of rationality outreach. I know one approach which seems to work, at least anecdotally and from personal experience: attach rationality as a rider on content intelligent people want to consume, and which requires or encourages engagement on its own terms. Some examples:

1. Educational materials. Though smart people don't generally want to improve their own rationality, they generally do want to learn. Interesting topics presented clearly for an intelligent reader seem to have a wide audience. Many subjects do not yet have particularly excellent expositions on the internet, and good teachers/mentors are always in short supply.

2. Competition. Interesting forms of competition exert a strong draw on many smart people. I believe that the world could support more formal competitions aimed at a broad range of ages.

3. Entertainment. The success of HP:MoR is some indication of the desire for intelligent science fiction / fantasy. In the case of video games there is an incredible shortage of high-quality games targeted at a really smart audience. Filling either of these vacuums well can draw a huge audience.

My belief is that intelligent rationalists can produce content significantly above common standards (from the perspective of an intelligent young person) in any of these forms.

How Is Rationality Attached?

Having an audience is not automatically useful. How do you convert people engaging with your content into people changing their behavior?

1. Choose effective content. Most critically: choose content which communicates the lessons you want to impart. Teach classes in a way that suggests that more is possible, encourages curiosity, conveys the usefulness of the material and its ability to anticipate and control real things, and gives a really smart student enough background and confidence to attack some important questions on their own. Run competitions and design games at which rationalists really do win, in which defeat really does suggest that more is possible, and in which the innate virtuosity of your opponents or fellow participants is not an excuse.

2. Force decompartmentalization. My experience is that once a source manages to change any aspect of your life, it becomes immensely more likely that it will change your life in other ways. Though this wades into dark arts territory, exploiting it seems to be important. If you introduce a student to a field they come to care about and do work in, the probability of affecting their behavior in other ways is increased enormously. More subtly, if you run a program or competition that requires any material investment (perhaps just going to a physical location) then its probability of affecting behavior may be significantly increased. If you design content that forces the audience to step back and re-evaluate anything at all about their life, then your chances of having a serious impact are greatly increased.

3. Foster the impression that individuals influence the world, not just the other way around. Talking about rationality explicitly is probably not going to get traction except in special cases. Trying to convince someone that their actions change the world is flattering enough that it might work if there is not too much resistance. If you really believe this, then the quality of your beliefs starts to matter. The possibility of real improvement becomes concrete, and understanding why improvement is hard becomes a priority. I may be a little too optimistic in believing that you can do any good this way, but it seems worth trying.

4. Channel the community. If given the opportunity, communities often crystallize around engaging content. I have little expertise at influencing communities, but I can say empirically that the direction of community discussion and involvement can be significantly influenced by authority, either directly or indirectly.

Who Is the Audience?

I believe that outreach is more likely to succeed and more important when the audience is intelligent. Until I have a better understanding of effective outreach, I intend to focus on an extremely small slice of the population at the upper end of general intelligence. For example, I am interested in outreach to high school students. I currently intend to target materials at what I believe, based on my experience, to be the top ~10,000 students. Details this precise are likely to change, but this at least gives a sense of what I am talking about.

How?

Supposing I decide that trying desperately to spread rationality is the best use of my time: how do I get there from here? What do I actually do? Of course a reasonable plan (in great generality) is to earn as much money as possible and give it to someone you trust to spend it as well as possible. The question is whether it is possible to do better. I believe that in some cases it is. Here are my thoughts on what we can do.

1. Invest free time. Cultivate hobbies and interests that allow you to spread rationality for fun, or when you need a break from work. Teaching classes or engaging in mentorship can help you understand how people learn and how they react to attempts at spreading rationality; acquire evidence and share it. Develop online resources that interest you--games, fiction, interactive tutorials, blogs--which might spark an interest in rationality or self-improvement. More importantly, try to gain an understanding of how smart but not necessarily rational people respond to these efforts. If you can't find anything that interests you enough to do for fun, but you think the problem is really important, you can treat the project as work on the side.

2. Make money. People pay money to participate in academic programs, advertisers pay money for access to a well-defined audience, and people consistently pay money for valuable content. However you feel about this, I currently believe rationality outreach is only going to succeed on a massive scale when it is backed by a combination of passionate rationalists and a good enough business model to keep them alive and support rapid growth.

3. Engage in meta-outreach. I have started by describing my thoughts on how to engage in effective rationality outreach rather than why I believe rationality outreach is important, but I would have started the other way around if the atmosphere on LW were different. Regardless, you should think carefully about the importance of rationality outreach. If you believe that the problem is really urgent and have a cogent argument, you should make every effort to encourage other rationalists to engage with the problem. I also believe that thinking seriously and concretely about how best to achieve a precise goal is an excellent and necessary exercise for improving our rationality, both individually and collectively.

What Are My Plans?

I have an unusually good background in theoretical computer science and mathematics, know people who have a lot of access to high school students (and have good familiarity with a large clique of brilliant high school students in the US), and am financially stable. I realize that not giving my money to any particular charity is tantamount to believing that such a charity should instead be giving money to me. I am currently on track to become a research scientist, and it is going very well. I have not been thinking about rationality outreach for long, but the calculus has come out overwhelmingly in its favor and I am rapidly updating my beliefs. I am not yet willing to disrupt my current life plan in a way that would be difficult to reverse.

I believe the best first step for me is to start using my spare time (which I can make as large as 30 hours a week) to develop and evaluate online learning/testing resources. Unless I encounter an unexpected obstacle, I would then like to start recruiting other types of talent, testing more broadly, and looking for a business model which can effectively support continuing development along these lines. I may well fail to execute this plan, but only because my beliefs changed or a better plan suggested itself (or, most likely, both).

Independently, I am interested in running an academic program for smart high school students. The existence of successful summer programs with a strong academic focus suggests that the market exists, and I believe that I could get enough support to run a program at least once (with future runs facilitated by success). I believe the main obstacles are the non-academic difficulties of running such a program and the recruitment of talented faculty, and the main benefits are probably an increased understanding of how to run a program and how to engage with smart high school students.

30 comments

TL;DR: A rationalist Tim Ferriss/Bill Gates/J.K. Rowling will be more effective in spreading rationality than a rationalist Billy Graham. Also, the more the better.

--

There is a lot of talk about creating more rationalists recently. But we also know that pushing our beliefs on others directly tends to activate their memetic immune system. Wouldn't a better approach be to create winners (in the conventional sense, as in, making loads of money, producing awesome things) who will then be able to effectively pull rather than push: "Want to win like me? Become more rational. Here's the material that helped me win."

Given that the basis of our epistemology is empiricism, this should be a convincing argument to us as well. If someone were to invent a process or system of thought that would reliably lead to more win, other things being equal, I have a hard time imagining why this community would not adopt it. That makes it not-Dark arts in my book. In fact, I think a large portion of the effectiveness of the Sequences and HP:MoR is that the readers detect a high awesomeness factor in Eliezer personally and would like to have more of that themselves.

In that sense, perhaps rather than guiding the best rationalists here to devote their lives to existential risk prevention, FAI research, or rationality outreach, we should be encouraging at least some* of them to use rationality to win big/become more awesome -in mainstream terms-, so they can operate as living case studies, directly visible and verifiable from our target audience's point of view. The more rationality is directly responsible for the win, the better. By acting as living proof of rationality, they may do more for spreading rationality and averting existential risks than if they worked at that directly. (And if they fail, then we need to examine why these failures occurred and improve our methods until we get it right; another reason to invest effort in short-term wins rather than putting all our eggs in the long-term win basket.)

* The ratio of people who should devote their energies to short-term success to those who should focus on long-term success (FAI research etc.) is not 100%, but it is also not 0%, at least not by default. The ratio should depend on how imminent we consider a singularity to be, and on the average winner's [0 -> success] interval as well as the impact we predict that success to have in creating more rationalists. This next generation of rationalists will again divide among those working to become examples and those working directly on long-term outcomes. The goal presumably is to make rationality self-propagating and eventually to help correct humanity's resource allocation towards the more rational (e.g. avoiding existential risk, among other things).

There is a lot of talk by financially successful people about how they became successful. We should understand if and how that talk actually affects behavior. My impression is that it does surprisingly little. Essentially, I am not convinced about the assertion that Bill Gates is better at changing behavior than Billy Graham. On the other hand, being able to point to successful examples who publicly endorse the pointing will undoubtedly help.

I think that engaging in value creation seems important as part of outreach. I believe that someone who designs amazing video games, for example, wins a lot of influence with their audience. In addition, they can make an effort to structure their game such that popular discussion about the game has a certain flavor, and that people who play the game engage with certain ideas while playing. (And they can carefully construct an online environment where fans of the game crystallize into a community.) The same can be said of the author of fiction, a designer of successful academic competitions or programs, or a designer of online learning materials.

I believe a smart rationalist who sets out to design the best online destination for foo may be able to succeed for a broad range of foo, and that this success can be much more rapidly obtained and is, if anything, more valuable for spreading rationality than spectacular financial success.

I think that by the standards of a high school version of myself (and presumably of a reasonable number of other high schoolers) I am already reasonably successful. I was accepted at MIT, and have developed a good enough research record while there to probably get into any graduate school I want. Can this sort of success be used to influence high school students in the near term? I have some evidence that I can implicitly leverage my situation to get some high school students to come to a summer program I run. Would the success of such a program be more or less helpful than traditional financial success?

I've also done well academically, but I am talking about success on a different order of magnitude, repeated for several people and causally related to an increase in rationality, not merely a halo effect.

For instance, Paul Graham has been incredibly successful in guiding the world towards his preferred outcomes (Hacker News, YCombinator, Startup Visa, lots more), and his first step was to build and sell a startup for approx $150m. Not everyone will agree with what he has to say, but with a calling card like that, they have to take his point of view seriously, and he's good at leveraging his success to create more Paul Grahams (making himself even more successful in the process). That's the sort of thing I have in mind.

Edit: To summarize, if Paul Graham is making a killing with domain-specific rationality, surely we should be raking in the billions or at least doing as well with general rationality? If not, why?

I understand what you mean by successful. I am not particularly successful by society's standards, nor by my current standards.

I agree that if a handful of overtly rationalist, spectacularly successful entrepreneurs emerged, it would lend incredible credibility to outreach efforts. After some thought, I agree this would be more helpful than almost any other form of success. But there isn't any evidence that being rational (or even being identical to a wildly successful entrepreneur) can let you make $100M reliably. I hope it can, and I'm sure plenty of rational (in the style of LW) people will try.

People have been analogizing the art of rationality to punching and the art of overcoming akrasia to kicking. This is a way of explaining why rationality alone is incomplete. Others have seen the lack of spectacular results by rationalists as evidence of rationality's non-utility. I can improve on this and offer new analogies that are closer parallels.

Rationality is not like punching; it is like blocking. As has been said, it is the art of not being stupid. As such, it is neither necessary nor sufficient for success. (Luck) × (Perseverance) × (Talent) = Success, and rationality decreases the amount of luck that is needed without providing perseverance (overcoming akrasia, striking) or talent (natural athleticism).

Regarding health, a hunter-gatherer lifestyle is healthier than a primitive farming one. Farming took over the world by being better for its societies, not for its societies' individuals. Yet only farming could have provided the infrastructure that gives us modern medicine. By analogy, there's hope yet for formal rationality. Gradually improving systemization will eventually outstrip what has evolved to be the natural path to success, so long as it can keep improving.

paulfchristiano:

Desire for accurate beliefs. Many people don't feel that improving the quality of their beliefs is worthwhile. This is a combination of a failure to recognize exactly how wrong many of their beliefs are and a general perception that it doesn't matter anyway.

This is a tremendously important issue, to which I think the post doesn't give the necessary attention. (It's also generally neglected on LW whenever topics like these are discussed.)

Here on LW, there is often a tacit assumption that changing one's beliefs towards greater accuracy is always a good and rational thing to do. However, this neglects two crucial factors. First, re-evaluating and changing one's beliefs has a cost in time and effort, which must be offset by some benefits to make it rational (in the economic sense at least). Second, when it comes to beliefs whose relevance is primarily of signaling (rather than instrumental) character, one may well be better off having an incorrect belief with superior signaling properties.

Therefore, when people feel that improving the quality of their beliefs is not worthwhile, there is at least a possibility that they might be right about it, and it's completely unjustified to dismiss this attitude as false or misguided out of hand.

This is related to the perception that there is no free lunch: as humans we resist the suggestion that we can get anything for free, and acting intelligently seems to be considered too easy.

But this is a very accurate heuristic. It's basically a correct application of the weak efficient markets hypothesis. When it comes to beliefs that have no instrumental implications, it's relatively easy to learn about biases from sources such as LW sequences and then use this knowledge to find and correct a bunch of incorrect beliefs. However, when it comes to beliefs that are relevant for practical action, figuring out how to improve those is extremely difficult. There is no straightforward way at all to apply LW sequence-style general intellectual skills profitably.

Improving your beliefs may not be the best way to become successful, but my goal isn't increasing the number of successful people in the world. What I care about (tautologically) are my goals, and those are well served by convincing intelligent people who care about the world that having correct beliefs is essential to doing good. The efficient market hypothesis has nothing to say about this assertion. Beliefs that have "no instrumental implications" are important to understanding the long-term consequences of our actions. Of course now I am relying on something more than the self-interest and intelligence of the audience, and the actual motive power of abstract morality is up for debate (and perhaps a more important thing to be attempting to spread).

I didn't give this issue much time because I judged that most people on LW wouldn't disagree--elaborating would be preaching to the choir. I agree it deserves discussion.

That said, I believe your applications of the efficient market hypothesis are generally not completely accurate. For example: people make a great deal of money by doing something obvious which anyone could do. Why? Being smart and having initiative is one sort of advantage, and the efficient market hypothesis says nothing about a rationalist's ability to turn that advantage into profit.

Is there anybody here currently following the successful-entrepreneur route of spreading rationality? I am part of the LW/OB group in NY, I have had more than reasonable success as a self-employed business consultant, and I've found that creating tangible results for clients makes them more amenable to hearing a pitch on rationality later on. Aiming high (C-level execs, prominent media personalities) and developing personal relationships have been essential to my outreach process.

Nice posting. A few random suggestions:

What Should be Taught?

  1. Desire for accurate beliefs... 2. Mindful living... 3. Initiative and confidence...

I would add one thing: 4) Coordination Is Hard and communication ain't always easy, either. But catalysing, organizing, and participating effectively in collective activity is a skill that can be acquired with some effort.

How do you convert people engaging with your content into people changing their behavior?

1) Choose effective content... 2) Force decompartmentalization... 3) Foster the impression that individuals influence the world... 4) Channel the community...

Good analysis. Here are some ideas about how to do this at three levels of engagement.

The first level consists of blogs, free online fiction, and contests/competitions targeted at smart high-school students. HPMoR is fun, but I would like to see something a little less fantasy-oriented and with slightly older protagonists. I don't know, but maybe a novel about students in a rationality dojo. Each of the protagonists is handicapped by a different cognitive bias. They make use of the formal lessons in their everyday lives, and thus gradually become more awesome. I know it sounds a bit formulaic, but a talented writer could probably pull it off.

The second level consists of a summer training program. I would recommend that this training include a physical component. A physical challenge like Outward Bound. But also, the participants should produce some tangible, physical things - items of camp infrastructure, their own meals, whatever.

The third level consists of clubs at universities. Going away to school is a big decompartmentalizing event for most people. Make use of it to extend and to reinforce the summer training session, and also to train and organize a cadre to keep the movement going.

I think that developing optimized rationality seeds is a particularly promising route for the first step. Once designed, a seed should be easy to copy and distribute; the target audience will do the bulk of the work themselves and can actually help plant more seeds. I guess by the time it gets to be self-replicating it's more of a 'virus', but I don't see any ways in which simple mutations could turn it to work for evil, since a major design feature is robustness in leading people towards the 'rationality' attractor.

More involved projects like teaching rationality in kindergarten are essential for complete success, but they don't seem like the best first move.

I think that developing optimized rationality seeds is a particularly promising route for the first step.

This should certainly be done, but it's a hard problem to attack. The environment in which the seed is encountered has some significant influence on how a reader will interact with it. If you like, you can view my problem as designing and drawing an audience to an environment in which existing persuasive techniques are adequate.

I think if you really believe that it is possible to design a virus in this way, you should work at it more aggressively. I am not too optimistic, largely because of the difficulty of experimentation and testing.

We live in a sea of viciously competitive memes, many created by people for a living - books, songs, advertising, culture in general.

Although science has learnt a huge amount about human cognitive biases in just the past few decades, it would be an error to assume that we therefore knew nothing before that. We knew really quite a lot, enough to have excellent working knowledge of how to approach the problem of creating a viable meme. And it turns out that if you want to create compelling memes, throwing lots of people at the problem, who all want the meme with their name on it to succeed, works quite well.

Designing a really compelling meme is in the "get other people to do what you want" class of problems that we grew brains for. It's something that plays to our natural aptitudes.

So just coming up with rationality seeds and throwing them out there into the memetic soup is a reasonable first approach, I think. Unless the seed is actually defective, it's very unlikely to do actual damage.

It seems like a good idea to explicitly discuss the presentation of the meme and how to effectively spread it (in particular, how many people should do it without risking damage). I also imagine an important ingredient is tools for quantitative analysis. For example, how easy is it to design links to LW or other resources that count click-throughs? How legally/socially questionable is it to compute aggregate statistics like return rates or retention time (for people with static IPs) for the users who clicked through?
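On the first question, the mechanics are simple. Below is a minimal sketch (not a production design) of a redirect service that counts click-throughs, assuming Flask is available; the slugs, URLs, and in-memory counter are all illustrative.

```python
# A minimal sketch of a link redirector that counts click-throughs, assuming
# Flask is installed. The slugs, URLs, and in-memory counter are illustrative;
# a real deployment would persist counts (e.g. in SQLite).
from collections import Counter
from flask import Flask, redirect

app = Flask(__name__)

LINKS = {"lw": "https://lesswrong.com/"}  # short slug -> destination URL
clicks = Counter()

@app.route("/r/<slug>")
def track_and_redirect(slug):
    if slug not in LINKS:
        return "unknown link", 404
    clicks[slug] += 1  # count the click-through, then forward the visitor
    return redirect(LINKS[slug], code=302)

@app.route("/stats")
def stats():
    # Aggregate counts only; no per-user data is recorded in this sketch.
    return "\n".join(f"{slug}: {count}" for slug, count in clicks.items())

if __name__ == "__main__":
    app.run()
```

Return rates and retention are harder: they require telling visitors apart across visits (cookies or IP addresses), which is exactly where the legal/social questions above begin.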

What happens to the popular perception of causes which are frequently (hopefully indirectly) advanced on online communities such as Reddit? I can easily imagine rationality "seeds" doing as much harm as good on the internet, frankly.

LW is not a very good place to link to. The creation of more condensed resources to point someone to is probably actually more important than any of these considerations. In particular, you probably only get something like 30 seconds to convince someone that they should stay. You only get a little more after that to convince them that it is even possible that you have something new and interesting to say. You also need to have some hope of eventually convincing a skeptic, which it's really not clear LW can do (it provides a lot of ammunition against itself if a normal person has an internal argument about whether, say, reading the Sequences is a good idea). Less Wrong failed to convince me that I should care after several hours of visiting (I would certainly not be here if I didn't interact in person with any other users), and I would describe myself as a pretty easy sell.

Well, we've had some threads to workshop ideas (1, 2). Results are variable - but that's okay. The main thing I would suggest is to keep brainstorming ideas.

(e.g. perhaps a new rationality blog that isn't LessWrong. Perhaps a hundred new rationality blogs that aren't LessWrong. I keep telling ciphergoth to start blogging the things he says so eloquently and concisely in person ... I posted today in my journal about a matter that's arguably of rationalist concern in a world of woo - and it's been getting great comments - so it doesn't even require surmounting the barrier of bothering to set up a whole new blog. Make rationalist-interest posts to your own blogs!)

Less Wrong failed to convince me that I should care after several hours of visiting (I would certainly not be here if I didn't interact in person with any other users), and I would describe myself as a pretty easy sell.

Since you are definitely the type of person I want LW to attract, I would be interested in anything you could remember about those first several hours of visiting.

In particular I am interested in the effect of the talk here about AGI research being at the same time a potent threat to human life and human civilization and an extremely effective form of philanthropy.

At the time I said "LW readers would be uniformly better served by applying their rationality than developing it further."

The sentiment is still somewhat applicable; the difference between then and now is that then I believed that my own rationality had reached the point where improvements were useless. LessWrong did nothing to convince me otherwise, even though this should have been its first priority if it was trying to change my behavior.

(My first real posts to LessWrong were amusing games vaguely related to a naive conception of safe AGI)

LW readers would be uniformly better served by applying their rationality than developing it further.

When you wrote that, did you mean applying it to the problems of society, to personal problems like wealth creation, or to both?

I recognize the difficulty and environmental dependence, and I still think it's the way to go.

In the comment below, you say that you spent several hours here not being convinced and that resources need to be more condensed so as to get people to stick around.

That's exactly the type of thing I'm talking about. Spend the first 30 seconds getting them to stay for a bit longer, and then spend that time sinking the hook deeper by making condensed arguments that they'll accept as at least plausible and will need to look into--or something like that. It's not an all-or-nothing thing, and I think there's room for a lot of improvement on the margin.

I am working on a very specific form of the problem very aggressively. Still mostly building skills/understanding that will allow me to tackle the problem though, so the effort is in the same direction as the general problem.

I think you should try being a bit more explicit about what your actual goal is.

Improving the rationality of the bottom half is much different than improving the rationality of the top percent (or any other subset).

I suspect that if you came up with some instrumental rationality metric measuring actual effectiveness when not just following the herd, it would follow a power law distribution, and when you're dealing with power laws it is critical to determine which tail you should spend all your effort on (see the sketch below).
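To make "the tail dominates" concrete, here is a minimal simulation sketch; the Pareto distribution and its shape parameter alpha = 1.2 are illustrative assumptions, not estimates of any real metric.

```python
# Illustrative sketch only: a heavy-tailed (Pareto) "effectiveness" metric,
# showing how much of the total sits in the top tail. The shape parameter
# alpha = 1.2 is an arbitrary assumption, not an empirical estimate.
import random

random.seed(0)
# 100,000 simulated people with Pareto-distributed effectiveness scores,
# sorted from most to least effective.
scores = sorted((random.paretovariate(1.2) for _ in range(100_000)), reverse=True)

total = sum(scores)
top_1_percent = sum(scores[: len(scores) // 100])
bottom_half = sum(scores[len(scores) // 2:])

print(f"share held by top 1%:     {top_1_percent / total:.0%}")
print(f"share held by bottom 50%: {bottom_half / total:.0%}")
# Under these assumptions the top 1% typically holds several times the
# bottom half's combined total, which is why the choice of tail matters.
```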

Of course, people do get to free ride on a lot of decisions, and this complicates things quite a bit, but it doesn't change the fact that your target audience is a critical choice that needs some real analysis and a more specific goal.

My guess is that more intelligent/rational people will be easier to get a hook in (less bootstrapping problem) and have the potential for more improvement and influence, and that this is where all the effort should be spent.

My immediate efforts are focused on roughly the top 0.1% of high school students measured by general intelligence. If I could I would focus on the top 0.01%.

I agree this should be described in the original post.

Is there another sense in which I should be more explicit about my goals?

Your goals could be something like improving personal relationships, allowing a better legal system to take over, or generating funding or talent for SIAI, which would require increasingly good rationalists.

If I had to guess, I'd guess that you're going for the 'other' category because it includes all sorts of important things which add up/may dominate, and figure that top 0.1%-0.01% is about right.

That doesn't sound like a bad answer at all, and I'm not sure how I'd change it. I'm just emphasizing the point that the choice matters a lot, and that extra thinking helps.

It's the kind of thing I'd want to have a discussion about at the next LW meetup, and by the time I start actively teaching people I'd want to have thought about it enough that I can't bring someone up to speed with my full reasoning in a minute or two.

Unfortunately, the majority of smart people don't believe that improving their own rationality is worthwhile (related to point 1 above).

The phrasing you use implies a conscious decision. In an over-the-top manner you might imagine a person who gets mind-saved, trained into rationality+, presented with all the data on how that influences their lives, and then decides to get memory-wiped back to the save point (and time-transferred too). That would be a conscious decision under full information. Obviously that is not the case. I think that the idea that there is more in the world that is amazing and useful and worthwhile to learn is not particularly widespread. So any information about this topic may get discarded by the very same system it tries to improve. A bit of a bootstrapping problem. You might get around that by carefully selecting people with good preconditions, or by a wide spread of the material. The rest of the world might only notice if many of those who participate in a certain training become successful, and if many of the successful people did a certain training program.

More simply, I believe that most smart people, when asked, would say that they are already rational, and not realize that there are a great many degrees of rationality and that they probably have subconscious biases working against them.

being aware of what you do not know times a hundred

The phrasing you use implies a conscious decision.

I think it is safe to say that when the issue comes up for consideration, most smart people make a conscious decision not to waste time improving their rationality. It may not be the best possible decision, but we rarely make the very best possible decision.

The idea that there is more in the world that is amazing and useful and worthwhile to learn is not particularly widespread.

I think many smart people continue to learn new things about the world. This doesn't generally translate into improving their rationality.

I'm a junior in high school, if you want somebody to bounce ideas off of. My email address is nojustnoperson@gmail.com.

What category would it fall under to start (and competently market) a community website focused on some particularly relevant topic? I guess this is sort of alluded to in the How section, but let me expand on it a little.

One thing I've noticed is that there are some pretty slick CMS systems out there these days, for example BuddyPress, which is layered on WordPress. It has a karma plugin for forum posts. I have been considering putting together a cryonics site from these components -- it almost seems decent enough to compete with Less Wrong, but is much easier to set up than Reddit.

You could probably take just about any hobby/interest group and overlap it with rationality training. For example groups that develop plugins and themes for web sites could use rationality as both a brand and to help them do a better job.

I wish you luck trying to spread rationality, you will probably need it.

Also, if you do make a program for high school students then depending on the location I may be interested in attending. (Technically not a high school student, but I am 17) And I can help if you want feedback on whatever ideas you have.

I'm in the Montgomery Blair High School Math, Science, Computer Science Magnet Program, and would be happy to talk to you about high school stuff.

I could also give you contact info for the Program Coordinator and such.