I want to help the world as much as possible: let's define that as the reduction of suffering1 of the human race.

I am an engineering student. It would be very simple to spend the rest of my life working on engineering problems, thereby making humanity more effective and so reducing suffering. That is pretty acceptable in terms of benefit to the world, and given my skill set it makes sense. The issue is that it may be a local maximum.

Now, what if instead of directly solving problems I focus my efforts on other people? Directing others down a more rational path. This may include things like telling people about Less Wrong, specifically addressing their direction in life, or providing information about effective charities instead of ineffective ones. This seems like a massively more efficient solution space, even if my social skills are weak. Compared to the output of a single engineer, the effect of improving the life paths of hundreds of people through better rationality would be enormous.

Here I make the assumption that more rational thought reduces suffering. I would argue this is the case because a less rational thinker is going to act in more conflicting, arbitrary, and harmful ways than otherwise. Clearer thinking means better judgements, and on average people's actions improve society, as evidenced by society having progressed to this point.

Still, directing others is not particularly efficient for me considering my skill set. This task should be delegated to someone with a more suitable skill set: a people person. That would let me get back to the problems I am actually efficient at. But here is the question: how many "people people" is it efficient to train? Even then, isn't my time still better spent as an ineffective people person than on direct problem solving? If so, shouldn't we all be focusing on expanding our numbers?

By doing so, will we increase the quality of our discussions by bringing in many new ideas and faces, or will more people hinder quality? Should we instead simply delve deeper into rationality, growing only as people discover the site organically? What is the ideal community growth rate? (Quality versus quantity? If Less Wrong expands too quickly, is that an issue?)

We could also produce some kind of rationality book or other set of materials if we wanted to improve rationality without necessarily expanding the community by linking people to the site. (Anyone super interested would go looking for more, and we'd definitely want them.)

I may be overestimating how much other people's lives can be influenced, but it seems highly probable that even a year of dedicated... irrationality sniping?... on a biweekly basis (every other Saturday, let's say) would have a profound effect on at least one other individual. Especially given the internet's connective ability, improving the cognition of other people seems cheap and high impact compared to direct problem solving.

If I am not solving those engineering problems, will someone else be? The question then becomes how much I value reducing the world's suffering relative to my other values, rather than the actual effectiveness of irrationality sniping versus direct problem solving.

Conclusion: To reduce world suffering I should reduce the irrationality of others, then convince those persons to do the same.

 

Should there be a lesson plan of sorts for aspiring rationalists? There are the Sequences, and people interested enough will find their own way around the site, but I am wondering if there is some more directed thought process we want people to go through, or if perhaps randomly encountering ideas might actually be better, especially if we are trying to expand rationality in all directions.

Regardless of the answers to these questions, I intend to print out flyers and scatter them across my college campus. Is there already a resource like that I can just print off? I figure even if I drop off the face of the earth, I'll have done some good that way.

 

1. Without, uh, killing everyone or tiling the universe with smiley faces.

P.S. Sorry if this comes off as rambling. No idea what I am doing. Never stopped me before.


You can do earning to give and then donate towards another person doing the outreach. At the moment CFAR is probably the best address.

Yes. Imagine my life's goal is to maximize the quality of the house I will own in the year 2045. I almost certainly shouldn't seek to build this house myself, but rather try to maximize my income so I could buy the best house. I will have to know a lot about houses so I can know which is the best, but only through a remarkable coincidence would my path to maximum income be via home construction.

An engineering salary for mechanical engineering is ~100k/year (http://www.mtu.edu/engineering/outreach/welcome/salary/). After paying off student loans, rent, and food, I would donate maybe 70k of that each year (assuming I satisfy none of my other values). If I could convince ten people to give 10k a year, then I would have done better than working full time as an engineer. This seems reasonable given a year, no job, and access to the internet. Better still if I could instead convince five people to convince three people each.
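To make that arithmetic explicit, here is a minimal sketch in Python using the figures assumed above (70k/year donated directly, 10k/year per convinced donor); the numbers are illustrative guesses, not a model of how outreach actually converts.

```python
import math

# Hypothetical yearly figures from the discussion above (all in dollars).
direct_donation = 70_000   # what I could donate myself from an engineering salary
per_recruit = 10_000       # assumed yearly donation from each person I convince

# How many steady donors would I need to recruit to match donating directly?
break_even = math.ceil(direct_donation / per_recruit)
print(f"Recruiting {break_even} donors at {per_recruit}/yr matches {direct_donation}/yr given directly")

# The second-order variant: convince five people to each convince three donors.
indirect_total = 5 * 3 * per_recruit
print(f"Five recruiters with three donors each: {indirect_total}/yr")
```

On these assumptions, seven recruited donors match the direct-donation baseline and ten clearly beat it, which is the comparison being made above.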

Your example breaks down because I am not trying to build the house, but rather directing others' income towards its construction. I might also direct others' career paths to create a high-quality construction company, so that when the time comes the quality of the house is even higher than it would be otherwise. We do have thirty years or so, after all.

This post from Alyssa Vance might be helpful to you.

The argument seems to be: by adding another level of meta, I would simply be redirecting the flow of money rather than producing value.

I would argue, however, that the vast majority of those charities are inefficient and therefore not worth donating to. Hence, by directing money away from them, value is actually produced through the reduction of inefficiency. Thus if I can reduce inefficiency by more than my donation amount, I am doing more social good than if I were to donate to them myself. Additionally, if I convert people to become more rational, they may produce value that would otherwise have been lost to inefficiency, and that might total more than my personal work would.

Charitable fundraising is a tough, competitive industry, and without evidence you shouldn't assume that you would be extremely successful at it. Keep in mind you would be competing against professional fundraisers who have massive institutional support. If you want to follow this path, I suggest you try an experiment and see if you can get people you know to donate, say, $1000 to CFAR.

Also, I strongly suspect that CFAR would rather get 20k a year from you than receive your application as a fundraiser.

get people you know to donate

That may be harder than getting people on the Internet to donate, as in the latter case you have a much larger audience. Especially if you work on it full-time for one year with no other job.

If I were to do it professionally then you are correct on all fronts, but I was thinking of a far more informal approach.

Literally just talking to people, getting them to read the materials, and discussing the nuances of rational thought. The point here was supposed to be less about the money, and more about the value of producing more rational individuals. I don't necessarily have to convince people to donate to one charity over another, but by exposing them to the ideas of the site they would seek out such optimizations on their own.

I'll grant you that it is probably a minority of the population who would change, but the sum of the changes from people whose lives change dramatically could offset that 20k a year. For example, if I were to find one other person who then decides to donate 20k a year, I would have done as much good as if I were working. Even if I stopped talking to people entirely.

There is of course nothing stopping me from also doing this while working, but if it is actually more effective, then being able to survive without working would be the ideal. Realistically it makes more sense to work for at least a decade while doing this on the side, build up enough cash to survive for a good long while, and then quit working and convert people for the rest of my life.

My impression was that CEA was doing more outreach than CFAR, and that CFAR was focused more on improving the rationality, effectiveness, etc. of those already in the EA/rationality movement.

As far as I understand, one of the core purposes of CFAR is to research how you can teach people to be more rational. If you want to raise the sanity waterline of the general population, you do need that research.

SPARC, as far as I understand it, is about getting young kids to be rational.

My personal experience with Quantified Self movement-building is that while we did have a mainstream media presence, most of the interesting people who came to our meetups didn't come through that channel.

I think that the level of immersion that SPARC provides is ideal for having a lasting impact.

My problem though is that if I were to instead convince others to do said donating, then I would overall push more dollars in that direction than the amount I personally could contribute by working. If that is indeed more effective, then more effective still would be to train advocates to convince people in my stead. I think it becomes a question of how much more effective. CFAR does look legit though. Thank you.

My problem though is that if I were to instead convince others to do said donating, then I would overall push more dollars in that direction than the amount I personally could contribute by working.

I think you underrate the difficulty of that task.

Probably.

I realized elsewhere my main point wasn't so much about donating, but rather that converting people to a more rational way of thinking would lead them to convince themselves to donate as an optimal strategy. Then again, if it really is most rational to improve the rationality of others, then that is what they would do rather than donating. (One also has to work enough to eat, but constraints are different for everyone, so I am ignoring that.)

There is an additional issue in that we are not totally rational, and we also value things other than the reduction of human suffering.

Hi there, Regex. It is great to see that you're interested in Effective Altruism. Are you already familiar with 80,000 Hours and GiveWell?

There are a few points which you raised above which I would like to respond to:

Still, directing others is not particularly efficient for me considering my skill set.

I do think that it is possible to actively work towards becoming a better conversationalist. This is something which I have recently been working on myself. Specifically, asking people questions about things they are interested in, and offering to help others in small ways, such as by sharing knowledge with them, seem like concrete things one can do to be a better people person.

I intend to print out flyers and scatter them across my college campus.

That reminds me of when I tried to tell all of my friends about how great GiveWell was, when I first learned about the organization a few years ago. Your results may vary, but I have found that it is hard to get people to care about things which they do not already care about, unless they already have a lot in common with you, intellectually and philosophically.

I think that speaking with friends who might enjoy LessWrong might be a better idea than spreading fliers across your campus. I might only feel that way about flier-posting because flier-posting is far too bold of a thing for anypony named Fluttershy to do, but it does seem to me that messages are cheapened by being circulated in such a manner.

Also, for what it is worth, I think that HPMoR appeals to a much larger set of people than the LessWrong sequences do, though people who identify closely enough with the "LessWrong culture" might benefit the most from being directly introduced to LessWrong.

Good luck in your endeavors!

(whoops can't figure out > quotes quite yet, so quotation marks will have to do.)

"Are you already familiar with 80,000 Hours and GiveWell?"

I had seen links to them, but hadn't gotten around to reading about them. I'll keep them in mind. Thanks.

"I do think that it is possible to actively work towards becoming a better conversationalist. "

Very true. When I learned about brain plasticity I was truly shocked at how much rewiring is possible. Did you know that you can lose either side of your brain as an infant and still live mostly normally? We can definitely improve ourselves significantly. The question is which things, when trained, will provide the most utility.

"Your results may vary, but I have found that it is hard to get people to care about things which they do not already care about, unless they already have a lot in common with you, intellectually and philosophically."

I was planning on putting these in the engineering buildings mostly. I would argue that it is much easier to convince engineering types to explore rationality first; then, as they explore the site, they can work out the money issues themselves.

"I think that speaking with friends who might enjoy LessWrong might be a better idea than spreading fliers across your campus."

I will agree that conversation is almost certainly the most effective route to convince a particular person, but many people aren't even aware that LW exists, so they might not even need convincing to explore. By putting it out there I catch people I don't even know, or couldn't possibly meet. This method has the disadvantage that I can't personally talk to them about it (maybe I could put my username on the flyer).

"I might only feel that way about flier-posting because flier-posting is far too bold of a thing for anypony named Fluttershy to do"

Silly pony.

"it does seem to me that messages spread by such means are cheapened by being circulated in such a manner."

To a certain degree yes, but not if the messages direct them to better manners of message circulation. The message I was thinking of was something along the lines of "Less Wrong: improve your critical thinking skills! Think smarter, not harder. Discuss ideas, philosophy, and rationality", which would hopefully read more as an invitation to a conversation about self-improvement than as an introduction to a cult that wants their money. Maybe rather than a flyer per se, it could be a tiny little square piece of paper with sparse details inviting curiosity.

"Also, for what it is worth, I think that HPMoR appeals to a much larger set of people than the LessWrong sequences do, though people who identify closely enough with the "LessWrong culture" might benefit the most from being directly introduced to LessWrong."

I'm not really sure how much that is the case. HPMoR is like 500k words long, and it is fanfiction. Readers of fanfiction won't care, but those who don't read it would be turned off. 500k words is nearly two full days of straight reading, even for me. I wouldn't call it particularly accessible either (I haven't read it yet though; I've sworn off fanfiction for the moment after The Month of Fanfiction).

LW, I would argue, is significantly better, but has issues in that it is jargon-filled and has kind of weird ideas. It has a TVTropes-like effect on me, at least: I read one thing which links to a dozen more things, so I never leave. As I am focusing on the engineering types, I should expect similar results. I probably should actually tell people about it and see what happens. You are right that a specific introduction is almost certainly more effective than a piece of paper floating around.

"Good luck in your endeavors!"

Thanks, I appreciate it.

We could also produce some kind of rationality book

EY's Sequences are currently being edited into ebook form.

That is a great thing to have! But still, if someone with good writing skills could make a more accessible version, that would be even better. That needs a person with a specific talent, though, and is probably not something we could do as a group. (As a group, we could try to find such talented people who are also rational, give them the Sequences, and later ask if they would be interested in writing something similar.)

I'm a copyeditor and proofreader. That would be my dream job.

Any samples of your work? Cost estimate for the whole Sequences remake? A sample chapter?

Sometimes dreams come true... :)

This will blow up my online anonymity, but whatever.

In 2008, when I had no knowledge of Overcoming Bias and its related memespace, I wrote this.

That online newspaper no longer exists, but I'm glad they keep their old archives browseable.

I'm currently working with my literary agent to pitch publishers on an LW/rationality book. Our initial attempts didn't work because the publishers didn't know what shelf the book would go on in bookstores. Now we are thinking of doing a rationality book for managers.

That reminds me of Nassim Taleb, who purposefully inserted a fictional chapter (chapter 2) into The Black Swan to mess with bookstores' ideas of how to categorize books.

It would be interesting to know how many books never happened because publishers didn't want to publish books that don't clearly fit.