The 'Irrationality Game' posts in Discussion came before my time here, but I had a very good time reading through their comment sections. I also had a number of thoughts I would have liked to post and get feedback on, but I knew that, buried in such old threads, they wouldn't get much attention. So I asked around, and the feedback suggests people would be open to a reboot!
Once again, I quote the original rules:
Please read the post before voting on the comments, as this is a game where voting works differently.
Warning: the comments section of this post will look odd. The most reasonable comments will have lots of negative karma. Do not be alarmed, it's all part of the plan. In order to participate in this game you should disable any viewing threshold for negatively voted comments.
Here's an irrationalist game meant to quickly collect a pool of controversial ideas for people to debate and assess. It kinda relies on people being honest and not being nitpickers, but it might be fun.
Write a comment reply to this post describing a belief you think has a reasonable chance of being true relative to the beliefs of other Less Wrong folk. Jot down a proposition and a rough probability estimate or qualitative description, like 'fairly confident'.
Example (not my true belief): "The U.S. government was directly responsible for financing the September 11th terrorist attacks. Very confident. (~95%)."
If you post a belief, you have to vote on the beliefs of all other comments. Voting works like this: if you basically agree with the comment, vote the comment down. If you basically disagree with the comment, vote the comment up. What 'basically' means here is intuitive; instead of using a precise mathy scoring system, just make a guess. In my view, if their stated probability is 99.9% and your degree of belief is 90%, that merits an upvote: it's a pretty big difference of opinion. If they're at 99.9% and you're at 99.5%, it could go either way. If you're genuinely unsure whether or not you basically agree with them, you can pass on voting (but try not to). Vote up if you think they are either overconfident or underconfident in their belief: any disagreement is valid disagreement.
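(Purely as an illustration of "basically agree / basically disagree": here is a minimal sketch in Python. The cutoff on the absolute difference in probabilities is an arbitrary assumption; the rule above says to just guess, so nothing in the game depends on this.)

```python
# Purely illustrative sketch of the voting heuristic above.
# The 0.05 threshold is an arbitrary assumption, not part of the game.

def vote(their_prob: float, your_prob: float, threshold: float = 0.05) -> str:
    """Return 'up' (basically disagree), 'down' (basically agree), or 'pass'."""
    gap = abs(their_prob - your_prob)
    if gap > threshold:
        return "up"    # big difference of opinion: upvote
    if gap < threshold / 2:
        return "down"  # basically agree: downvote
    return "pass"      # genuinely unsure: sit this one out

print(vote(0.999, 0.90))   # up   -- a pretty big difference of opinion
print(vote(0.999, 0.995))  # down -- could go either way; here it falls on agree
```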
That's the spirit of the game, but some more qualifications and rules follow.
If the proposition in a comment isn't incredibly precise, use your best interpretation. If you really have to pick nits for whatever reason, say so in a comment reply.
The more upvotes you get, the more irrational Less Wrong perceives your belief to be. Which means that if you have a large amount of Less Wrong karma and can still get lots of upvotes on your crazy beliefs, you will get lots of smart people to take your weird ideas a little more seriously.
Some poor soul is going to come along and post "I believe in God". Don't pick nits and say "Well, in a Tegmark multiverse there is definitely a universe exactly like ours where some sort of god rules over us..." and downvote it. That's cheating. You better upvote the guy. For just this post, get over your desire to upvote rationality. For this game, we reward perceived irrationality.
Try to be precise in your propositions. Saying "I believe in God. 99% sure." isn't informative because we don't quite know which God you're talking about. A deist god? The Christian God? Jewish?
Y'all know this already, but just a reminder: preferences ain't beliefs. Downvote preferences disguised as beliefs. Beliefs that include the word "should" are almost always imprecise: avoid them.
That means our local theists are probably gonna get a lot of upvotes. Can you beat them with your confident but perceived-by-LW-as-irrational beliefs? It's a challenge!
Additional rules:
- Generally, no repeating an altered version of a proposition already in the comments unless it's different in an interesting and important way. Use your judgement.
- If you have comments about the game, please reply to my comment below about meta discussion, not to the post itself. Only propositions to be judged for the game should be direct comments to this post.
- Don't post propositions as comment replies to other comments. That'll make it disorganized.
- You have to actually think your degree of belief is rational. You should already have taken the fact that most people would disagree with you into account and updated on that information. That means that any proposition you make is a proposition that you think you are personally more rational about than the Less Wrong average. This could be good or bad. Lots of upvotes means lots of people disagree with you. That's generally bad. Lots of downvotes means you're probably right. That's good, but this is a game where perceived irrationality wins you karma. The game is only fun if you're trying to be completely honest in your stated beliefs. Don't post something crazy and expect to get karma. Don't exaggerate your beliefs. Play fair.
- Debate and discussion is great, but keep it civil. Linking to the Sequences is barely civil -- summarize arguments from specific LW posts and maybe link, but don't tell someone to go read something. If someone says they believe in God with 100% probability and you don't want to take the time to give a brief but substantive counterargument, don't comment at all. We're inviting people to share beliefs we think are irrational; don't be mean about their responses.
- No propositions that people are unlikely to have an opinion about, like "Yesterday I wore black socks. ~80%" or "Antipope Christopher would have been a good leader in his latter days had he not been dethroned by Pope Sergius III. ~30%." The goal is to be controversial and interesting.
- Multiple propositions are fine, so long as they're moderately interesting.
- You are encouraged to reply to comments with your own probability estimates, but comment voting works normally for comment replies to other comments. That is, upvote for good discussion, not agreement or disagreement.
- In general, just keep within the spirit of the game: we're celebrating LW-contrarian beliefs for a change!
I would suggest placing *related* propositions in the same comment, but wildly different ones might deserve separate comments to keep the threads separate.
Make sure you put "Irrationality Game" as the first two words of any comment containing a proposition to be voted upon in the game's format.
Here we go!
EDIT: It was pointed out in the meta-thread below that this could be done with polls rather than karma, so as to discourage playing-to-win and to get around the hiding of downvoted comments. If anyone resurrects this game in the future, please do so under that system. If you wish to test a poll format in this thread, feel free to do so, but continue voting as normal for comments that are not in poll format.
My disagreement is that the anthropic reasoning you use is not a good argument against the existence of large civilizations.
I am using a future light cone, whereas your alternatives seem to be formulated in terms of a past light cone. Let me say that I think the probability of ever encountering another civilization is related to the ratio {asymptotic value of the Hubble time} / {time since the appearance of civilizations became possible}. I can't find the numbers this second, but my feeling is that such an encounter is far from certain.
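(As a rough illustration of that ratio, here is a back-of-envelope sketch in Python. Both timescales are placeholder assumptions for illustration, not the numbers I couldn't find.)

```python
# Back-of-envelope sketch of the ratio argument above.
# Both timescales are placeholder assumptions, for illustration only.

HUBBLE_TIME_ASYMPTOTIC_GYR = 16.0  # assumed asymptotic (de Sitter) Hubble time
CIV_WINDOW_GYR = 8.0               # assumed time since civilizations became possible

# Crude proxy: the chance of ever encountering another civilization is taken
# to scale with how much of the appearance window remains reachable inside
# our future light cone before accelerating expansion cuts it off.
ratio = HUBBLE_TIME_ASYMPTOTIC_GYR / CIV_WINDOW_GYR
print(f"ratio ~ {ratio:.1f}")  # order unity: plausible but far from certain
```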
Very good point! I think that if the "computronium universe" is not suppressed by some huge factor due to some sort of physical limit / great filter, then there is a significant probability that such a universe arises from post-human civilization (e.g. due to FAI). All decisions with possible (even small) impact on the likelihood and/or the properties of this future get a huge utility boost. Therefore I think decisions with long-term impact should be made as if we are not in a simulation, whereas decisions involving purely short-term optimization should be made as if we are in a simulation (although I find it hard to imagine a short-term decision for which being in a simulation matters).
The effective time-discount function decays rather slowly, because the sum over universes includes time-translated versions of the same universe. As a result, the effective discount falls off as 2^{-K(t)}, where K(t) is the Kolmogorov complexity of t, which is only slightly faster than 1/t. Nevertheless, for huge time differences your argument is correct. This is actually a good thing, since otherwise your decisions would be dominated by the Boltzmann brains appearing long after heat death.
It is about 1/t × 1/log t × 1/log log t, etc., for most values of t (taking base-2 logarithms). There are exceptions for very regular values of t.
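To spell out where that product comes from: for most t, the complexity is dominated by the length of a self-delimiting (prefix-free) encoding of t, so, as a sketch with base-2 logarithms throughout,

\[
K(t) \;\approx\; \log_2 t \,+\, \log_2 \log_2 t \,+\, \log_2 \log_2 \log_2 t \,+\, \cdots
\quad\Longrightarrow\quad
2^{-K(t)} \;\approx\; \frac{1}{t} \cdot \frac{1}{\log_2 t} \cdot \frac{1}{\log_2 \log_2 t} \cdots
\]

Very regular values of t (say, powers of two) have much shorter descriptions than this encoding, hence the exceptions.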
Incidentally, I've been thinking about a similar weighting approach towards anthropic reasoning, and it seems to avoid a strong form of the Doomsday Argument (one where we bet heavily against our civilisation expanding). Imagine listing all the observers (or observer moments) in order of appea...