Viliam_Bur comments on Open Thread: March 4 - 10 - Less Wrong

Post author: Coscott 04 March 2014 03:55AM


Comment author: Viliam_Bur 05 March 2014 11:51:24AM 5 points

In the previous Open Thread NancyLebovitz posted an article about the living-Biblically-for-one-year guy deciding to try living one year rationally. Alicorn noticed that the article was from 2008, so the project was probably cancelled.

However, I was thinking: if someone tried to do this, what would be the best way to do it? (It's easy to imagine wrong ways: Hollywood rationality, etc.) We can assume that the person trying this experiment is not among the most rational people in the world, because those people would already be too busy optimizing the universe and wouldn't have a year to spend on such an experiment. Also, they would probably already be living pretty rationally, so there would be no big change in their life, and therefore no interesting report. (Although participation in the experiment might create some extra incentive to behave rationally more consistently.) On the other hand, a person too irrational would not be able to perform the task successfully. So let's assume that the experimental subject is... maybe an average LW reader, or someone generally LW-compatible who hasn't found the website yet. (This also assumes that the LW model of rationality is approximately correct. Without this assumption it doesn't make much sense to discuss the best strategy here.)

So... let's suppose we have a volunteer who says: "I will try living the next year as rationally as possible, within my limits of course, so give me advice on how to do it best. (In exchange, I promise to keep logs and diaries and publish the whole story, which could create some popularity for LW and CFAR.)" What advice would we give them?

A good piece of meta-advice would be to keep a feedback loop with other aspiring rationalists: not just take some initial advice, go away, return after one year with the report, and risk getting a "you completely misunderstood it" reaction. Instead, they should stay in contact; the question is merely how frequent and how detailed the optimal contact would be, to avoid wasting too much time in web discussions. I could imagine: asking specific difficult questions whenever necessary, and writing a detailed report every month, with plans for the following months, so people on LW could comment on the strategy. Of course, even this decision could be discussed on LW.

Now this feels a bit like cheating. Are we trying to test what one person can achieve during a year of living rationally, or are we using the LW hive-mind to optimize the person? In other words, would the results of the experiment speak to the benefits of rationality for one person, or to the benefits of having the LW hive-mind available? Uhm... maybe there is actually no difference there. I mean, it is rational to use the best tools available: the virtue of scholarship, optimizing our social environment, a munchkin attitude, etc. For a munchkin, there is no such thing as "cheating"; there is only more or less winning.

But the important question is what the goal of this experiment is. Is it optimizing one person's life? Or is it describing a strategy that dozens of other people may follow? Because if too many people decide to follow it, the LW hive-mind may be unable to provide quality advice to all of them. On the other hand, such an event might motivate the LW hive-mind to become stronger and invent more efficient ways of supporting aspiring rationalists.

Uhm... I guess some forms of cheating should be prohibited. For example, suppose a poor person volunteers for the project, some people from LW send them money, and then they rationalize it as winning by being rational even though the person did nothing else smart. ("What? In their situation it was rational to volunteer for the rationality experiment and ask people for money. It was a strategy that successfully increased their utility, and rationality by definition is winning.") On the other hand, if the person asks LW members for expert advice in a domain they didn't study, I think that is completely fair; that is what they could (and perhaps should) have done even without the experiment. So some kinds of support feel okay, and other kinds do not.

Maybe the proper question is: imagine that the day after the report is successfully published, 1000 more people want to try the same strategy. Would we feel that this contributed to our goal of raising the sanity waterline?

I also think that this kind of experiment would be fun, which is probably the main reason I describe it; but as a side effect, if successful, it could make great marketing material. What do you think? Is this "try one year of living as rationally as possible with the support of the LW hive-mind" experiment a good idea? Is anyone interested in volunteering? Are enough people interested in supporting them? (If yes, maybe we could launch the project on April 1st, April Fools' Day, because it's about all of us being less foolish, isn't it?)


Comment author: ChristianKl 06 March 2014 09:36:27AM 0 points

I think most Christians would say that Jacobs completely misunderstood Christianity.

I think that experiments like this, which take ideas very seriously, are good because they give us an additional perspective on what rationality happens to be.

Comment author: Viliam_Bur 06 March 2014 11:32:36AM 4 points

I think most Christians would say that Jacobs completely misunderstood Christianity.

Jacobs' Biblical behavior : Christianity = Hollywood rationality : LW rationality

He cheated by approximating the outside behavior while preserving his inside behavior (thoughts and beliefs) as much as possible. When the year was over, he probably reverted to normal. That kind of experiment is only good for examining a strawman. And also... for publicity.

I believe that in this community it is completely obvious that we are not trying to practice Hollywood rationality. However, there is still a risk that our understanding is imperfect, and taking ideas seriously will expose the imperfections. For example, we may publicly profess that emotions are important, and yet our "rational" plans may fail to consider them. But this is where we need to use our ability to go meta and think: "Okay, this plan sounded completely reasonable, but now that I have been following it for two months, I feel somehow unhappy and unmotivated," and then update the plan, instead of merely (a) blindly following it, or (b) giving up completely.

Comment author: ChristianKl 06 March 2014 11:57:34AM 1 point

One of the best examples I have of a "rational" plan is my attempt to gain weight by adding 800 kcal of maltodextrin to my daily tea consumption. It made so much sense.

On the other hand, it didn't work, and it took me 2 months to admit that my scale still showed the same weight. The planes didn't land.
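For scale, a back-of-envelope sketch of what the plan naively predicted. This uses the common ~7700 kcal-per-kg rule of thumb (an assumption, not anything stated in the thread, and one that ignores adaptation, absorption, and whether the extra calories were a true surplus over maintenance):

```python
# Naive expected weight gain from a fixed daily calorie surplus.
# Assumes the rough rule of thumb of ~7700 kcal per kg of body mass,
# and that the extra intake is a genuine surplus over maintenance.
KCAL_PER_KG = 7700

def expected_gain_kg(surplus_kcal_per_day: float, days: int) -> float:
    """Predicted weight gain if every surplus calorie were stored."""
    return surplus_kcal_per_day * days / KCAL_PER_KG

# 800 kcal/day over the 2 months of the experiment:
print(round(expected_gain_kg(800, 60), 1))  # -> 6.2 (kg predicted; the scale showed 0)
```

The gap between the ~6 kg the naive model predicts and the zero change observed is exactly the kind of result that should force a plan update.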

However, there is still a risk that our understanding is imperfect, and taking ideas seriously will expose the imperfections.

I think it's pretty certain that our understanding isn't 100% perfect. We can run controlled trials to update our understanding of rationality, and as far as I understand, CFAR wants to go that way.

Taking ideas overly seriously is another way to see imperfections and gather knowledge. When one tries to gather knowledge about a domain, I think it's useful to use many different approaches.

Comment author: Fossegrimen 06 March 2014 04:36:04PM 0 points

Why were you trying to gain weight, and is it still a goal?

I deliberately adjust my weight up or down by ~10 kg fairly regularly, and depending on your situation, I might be able to offer some ideas.

Comment author: ChristianKl 06 March 2014 04:45:57PM 0 points

Yes, I still have that goal. I'm 181 cm tall and have been at 56 kg ±2 kg for the last 3 years; probably for the last ten as well, but only in the last 3 have I had regular measurements.
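For context, a quick check of those numbers against the standard BMI formula (weight in kg divided by height in meters squared; the formula and the conventional 18.5 underweight cutoff are general knowledge, not anything from the thread):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# 56 kg at 181 cm:
print(round(bmi(56, 1.81), 1))  # -> 17.1, below the conventional 18.5 threshold
```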