AdeleneDawner comments on Whining-Based Communities - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (94)
Another great post, thanks Eliezer! But if rationality is for you to win, shouldn't you try to keep it a secret from others? For example, if you knew a way to make money in the stock market, would you spread it if doing so nullified your advantage?
Winning isn't necessarily zero-sum.
Two things:
Regarding the second point: that's why Guilds were created, and they were quite powerful in their day. Why do you think they're called 'trade secrets'?
But modern technological civilisation didn't take off until the guild system (keep it secret) was replaced by the patent system (publish it).
From Newcomb's Problem and Regret of Rationality (with emphasis added):
So yes, if for you winning means making money, and your best strategy for doing so is to exploit irrationality in the stock market, then you will be motivated to keep your methods of rationality secret.
If, on the other hand, your utility function has a term for others, then you will want to teach them to be rational and win.
Eliezer is trying to win by creating a Friendly AI. If he gets more people to help him, this will help him win. If he spreads rationality, this will get more people to help him. Thus, he is spreading rationality to help him win.