RichardKennaway comments on Open Thread: March 4 - 10 - Less Wrong

3 Post author: Coscott 04 March 2014 03:55AM




Comment author: Viliam_Bur 05 March 2014 11:51:24AM  5 points

In the previous Open Thread NancyLebovitz posted an article about the living-Biblically-for-one-year guy deciding to try living one year rationally. Alicorn noticed that the article was from 2008, so the project was probably cancelled.

However, I was thinking... if someone tried to do this, what would be the best way to do it? (It's easy to imagine wrong ways: Hollywood rationality, etc.) We can assume that the person trying this experiment is not among the most rational people in the world, because they would already be too busy optimizing the universe and wouldn't have a year to spend on such an experiment. Also, they would probably already be living pretty rationally, so there would be no big change in their life, and therefore no interesting report. (Although participation in the experiment might create some extra incentive to behave rationally more consistently.) On the other hand, a person who is too irrational would not be able to perform the task successfully. So let's assume that the experimental subject is... maybe an average LW reader, or someone generally LW-compatible who hasn't found the website yet. (This also assumes that the LW model of rationality is approximately correct. Without this assumption, it doesn't make much sense to discuss the best strategy here.)

So... let's suppose we have a volunteer who says: "I will try living the next year as rationally as possible, of course within my limits, so advise me on how to do it best. (In exchange I promise to keep logs and diaries, and to publish the whole story, which could create some popularity for LW and CFAR.)" What advice would we give them?

A good piece of meta-advice would be to keep a feedback loop with other aspiring rationalists: not just take some initial advice, go away, return after one year with the report, and risk getting a "you completely misunderstood it" reaction. Instead, they should stay in contact; the question is merely how frequent and how detailed the optimal contact would be, to avoid wasting too much time in web discussions. I could imagine asking specific difficult questions whenever necessary, and writing a detailed report every month with plans for the following months, so people on LW could comment on the strategy. Of course, even this decision could be discussed on LW.

Now this feels a bit like cheating. Are we trying to test what one person can achieve during a year of living rationally, or are we using the LW hive-mind to optimize the person? In other words, would the results of the experiment speak to the benefits of rationality for one person, or to the benefits of having an LW hive-mind available? Uhm... maybe there is actually no difference there? I mean, it is rational to use the best tools available: the virtue of scholarship, optimizing our social environment, a munchkin attitude, etc. For a munchkin, there is no such thing as "cheating"; there is only more or less winning.

But the important question is what the goal of this experiment is. Is it optimizing one person's life? Or is it describing a strategy that dozens of other people may follow? Because if too many people decide to follow it, the LW hive-mind may be unable to provide quality advice to all of them. On the other hand, such an event might motivate the LW hive-mind to become stronger and invent more efficient ways of supporting aspiring rationalists.

Uhm... I guess some forms of cheating should be prohibited. For example, if a poor person volunteers for the project and some people from LW send them money, they could then rationalize it as winning by being rational, even if they do nothing else smart. ("What? In their situation it was rational to volunteer for the rationality experiment and ask people for money. It was a strategy that successfully increased their utility, and rationality by definition is winning.") On the other hand, if the person asks LW members for expert advice in a domain they didn't study, I think that is completely fair; that is what they could (and perhaps should) have done even without the experiment. So some kinds of support feel okay, and others don't.

Maybe the proper question is: imagine that the day after the report is successfully published, 1000 more people want to try the same strategy. Would we feel that this contributed to our goal of raising the sanity waterline?

I also think that this kind of experiment would be fun, which is probably the main reason why I describe it; but as a side effect, if successful, it could make great marketing material. What do you think? Is this "try one year of living as rationally as possible with the support of the LW hive-mind" experiment a good idea? Is anyone interested in volunteering? Are enough people interested in supporting them? (If yes, maybe we could launch the project on April 1st, April Fools' Day, because it's about all of us being less foolish, isn't it?)


Comment author: RichardKennaway 06 March 2014 01:49:42PM 1 point

I think this is an excellent project, so excellent that I have to ask, why are we not (if indeed we are not) already doing this, all the time?

Weight-watcher groups are watching their weight all the time, not just when meeting to talk about it. People who meet to help each other learn a foreign language are learning that language the rest of the time, at least for as many hours a day as they find useful. Serious university students make studying a full-time job. Rationality is supposed to be applicable to everything; every moment is an opportunity for practice.

Comment author: Viliam_Bur 06 March 2014 03:46:52PM 1 point

You mentioned specific groups that try to reach specific goals. That's great on the level of individual goals, but we also need to go more meta. The foreign-language group will not tell you to stop learning the language if your life situation changes so that the original purpose of learning it is no longer valid, or if a better opportunity simply appears and it would be rational to move your limited resources from the language to something else. Also, if you haven't already decided to study a specific language, the group will not find you and explore with you whether starting to learn one would be a good idea for you.

A rationalist group could help with this. We could already provide this support to each other: at meetups, on Skype, on mailing lists. Some of us already use our good friends for this purpose, but the problem is that these friends are not always LW-style rationalists, so sometimes we only get their "cached thoughts" as advice. Also, some people may use a psychologist; not necessarily as a source of rational advice, but as someone to listen and reflect back obvious irrationalities.

So I think many of us are already using somewhat similar solutions, but either they were not consciously optimized, or they were optimized only for a partial goal.