AdeleneDawner comments on Whining-Based Communities - Less Wrong

59 Post author: Eliezer_Yudkowsky 07 April 2009 08:31PM


Comment author: roland 08 April 2009 05:36:26AM 1 point [-]

Another great post, thanks Eliezer! But if rationality is for you to win, shouldn't you try to keep it secret from others? For instance, if you knew a way to make money in the stock market, would you spread it if doing so nullified your advantage?

Comment author: loqi 08 April 2009 06:12:06AM 10 points [-]

Winning isn't necessarily zero-sum.

Comment author: Vladimir_Nesov 08 April 2009 09:13:16AM *  6 points [-]

Two things:

  • Advantage over others is not the only thing people care about.
  • A "rationality" developed in secret is unlikely to grow more powerful than whatever technology a single farmer from the Dark Ages could develop in a lifetime, which is to say, not impressive at all.

Comment author: Annoyance 08 April 2009 03:42:01PM 3 points [-]

Regarding the second point: that's why Guilds were created, and they were quite powerful in their day. Why do you think they're called 'trade secrets'?

Comment author: Peterdjones 08 June 2011 11:30:23AM 1 point [-]

But modern technological civilisation didn't take off until the guild system (keep it secret) was replaced by the patent system (publish it).

Comment author: JGWeissman 08 April 2009 06:43:25AM 5 points [-]

From Newcomb's Problem and Regret of Rationality (with emphasis added):

Don't mistake me, and think that I'm talking about the Hollywood Rationality stereotype that rationalists should be selfish or shortsighted. If your utility function has a term in it for others, then win their happiness. If your utility function has a term in it for a million years hence, then win the eon.

So yes, if for you winning means making money, and your best strategy to do that is to take advantage of irrationality in the stock market, then you will be motivated to keep your methods of rationality secret.

If, on the other hand, your utility function has a term for others, then you will want to teach them to be rational and win.

Comment author: DanielLC 14 April 2013 03:37:47AM 2 points [-]

Eliezer is trying to win by creating a Friendly AI. If he gets more people to help him, this will help him win. If he spreads rationality, this will get more people to help him. Thus, he is spreading rationality to help him win.