Irrationality Game: Less Wrong is simply my Tyler Durden—a dissociated digital personality concocted by my unconscious mind to be everything I need it to be to cope with Camusian absurdist reality. 95%.
is quite safe if done properly
That's the thing -- it's basically an issue of idiot-proofing. Many things are "safe if done properly" and still are not a good idea because people in general are pretty bad at doing things properly.
Flush toilets are idiot-proof to a remarkable degree. Composting human manure, I have my doubts.
Irrationality game: Every thing which exists has subjective experience (80%). This includes things such as animals, plants, rocks, ideas, mathematics, the universe, and any subcomponent of an aforementioned system.
Irrationality game:
Most posthuman societies will have a violent death rate much higher than humans have ever had. Most posthumans who will ever live will die in wars. 95%
Interesting. So, you have Robin Hanson's belief that we won't get a strong singleton; but you lack his belief that emulated minds will be able to evaluate each other's abilities with enough confidence that trade (taking into account the expected value of fighting) will be superior to fighting? That's quite the idiosyncratic position, especially for 95% confidence.
You (the reader) do not exist.
EDIT: That was too punchy and not precise. The reasoning behind the statement:
Most things which think they are me are horribly confused gasps of consciousness. Rational agents should believe the chances are small that their experiences are remotely genuine.
EDIT 2: After thinking about shminux's comment, I have to retract my original statement about you readers not existing. Even if I'm a hopelessly confused Boltzmann brain, the referent "you" might still well exist. At minimum I have to think about existence more. Sorry!
Irrationality Game: I am something ontologically distinct from my body; I am much simpler and I am not located in the same spacetime. 50%
EDIT: Upon further reflection, my probability assignment would be better represented as the range between 30% and 50%, after factoring in general uncertainty due to confusion. I doubt this will make a difference to the voting though. ;)
Irrationality game - there is a provident, superior entity that is in no way infinite. (I wonder whether people here would call that God. As a "superman theist" I had to put the "odds of God (as defined in question)" at 5% but identified as strongly theist in the last census.)
Edit: forgot odds. 80%
The universe is finite, and not much bigger than the region we observe. There is no multiverse (in particular Many Worlds Interpretation is incorrect and SIA is incorrect). There have been a few (< million) intelligent civilisations before human beings but none of them managed to expand into space, which explains Fermi's paradox. This also implies a mild form of the "Doomsday" argument (we are fairly unlikely to expand ourselves) but not a strong future filter (if instead millions or billions of civilisations had existed, but none of them expanded, there would be a massive future filter). Probability: 90%.
Irrationality Game: One can reliably and predictably make $1M / year, and it's not that difficult. (Confidence: 75%)
Irrationality game:
There are other 'technological civilizations' (in the sense of intelligent living things that have learned to manipulate matter in a complicated way) in the observable universe: 99%
There are other 'technological civilizations' in our own galaxy: 75% with most of the probability mass in regimes where there are somewhere between dozens and thousands.
Conditional on these existing: Despite some being very old, they are limited by the hostile nature of the universe and the realities of practical manipulation of matter and energy to never co...
Irrationality game:
Nice idea. This way I can safely test whether the baseline of my opinion on LW topics is as contrarian as I think.
My proposition:
On The Simulation Argument I go for "(1) the human species is very likely to go extinct before reaching a “posthuman” stage" (80%)
Correspondingly on The Great Filter I go for failure to reach "9. Colonization explosion" (80%).
This is not because I think that humanity is going to self-annihilate soon (though this is a possibility).
Irrationality game: people are happier when living in traditional social structures, and value being part of their traditions[1]. The public existence of "weird" relationships (homosexuality, polyamory, BDSM, ...) is actively harmful to most people; the open practice of them is a net negative for world utility. Morally good actions include condemnation and censorship of such things.
[1] Or rather what they believe are their traditions; these beliefs may not be particularly well-correlated with reality.
Irrationality Game: Currently, understanding history or politics is a better avenue than studying AI or decision theory for dealing with existential risk. This is not because of the risk of total nuclear annihilation, but because of the possibility of political changes that result in setbacks to or an accelerated use and understanding of AI. 70%
I'm 99% confident that dust specks in 3^^^3 eyes result in less lost utility than 50 years of torturing one person.
Just as a curiosity, this was the most downvoted comment in the original thread:
For a large majority of people who read this, learning a lot about how to interact with other human beings genuinely and in a way that inspires comfort and pleasure on both sides is of higher utility than learning a lot about either AI or IA. ~90%
(-44 points)
Irrationality game:
Most progress in medicine in the next 50 years won't come from advances in molecular biology and the production of drugs designed to target specific biochemical pathways, but from other paradigms.
Probability: 75%
Irrationality game: The straightforward view of the nature of the universe is fundamentally flawed. 90%
By "fundamentally flawed", I mean things like:
Irrationality game: The Great Stagnation is actually occurring, and it is mostly due to fossil fuel depletion rather than (say) leftist politics or dysgenics. (60%)
Irrationality game: most opposition to wireheading comes from seeing it as weird and/or counterintuitive in the same way that most non-LWers see cryonics/immortalism as weird. Claiming to have multiple terminal values is an attempt to justify this aversion. 75%
Irrationality Game: We need a way to give feedback on irrationality game entries that the troll toll won't mess with. (98%)
[pollid:643]
Irrationality Game:
Everyone alive in developed nations today will die a fairly standard biological death by age:
150: 75%
250: 95%
(This latter figure accounts for the possibility that the stories of the odd Chinese monk living to age 200+ after eating only wild herbs from age 10 onward are actually true and not exaggerations, or of someone religiously sticking to an unreasonably effective calorie-restriction regime combined with some interesting metabolic rejiggering in the coming decade or two.)
The majority (90+%) of people born in developed nations today will die a fairly standard biological death by age:
120: 85%
150: 99%
Irrationality Game:
Politics (in particular, large governments such as the US, China, and Russia) are a major threat to the development of friendly AI. Conditional on FAI progress having stopped, I give a 60% chance that it was because of government interference, rather than existential risk or some other problem.
If the universe were to survive 280 billion years, then that would put us within the first 5% of the universe's lifespan. So, if we take an alpha of 5%, we can reject the hypothesis that the universe will last more than 280 billion years.
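The arithmetic behind this bound can be sketched as follows (the current age of the universe, roughly 13.8 billion years, is a standard cosmological figure, not something stated in the comment):

```python
# Copernican-style bound: assume we observe the universe at a uniformly
# random point in its total lifespan T.  With probability 0.95 our
# observation falls after the first 5% of T, so T <= current_age / 0.05.
current_age = 13.8  # approximate age of the universe, in billions of years
alpha = 0.05        # significance level used in the comment

upper_bound = current_age / alpha  # longest lifespan not "rejected"
print(round(upper_bound))  # 276 -- roughly the 280 billion years quoted
```

This is exactly the reasoning the reply below criticizes: the bound follows mechanically from assuming we sit at a random point in time.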
That sounds like "Copernican" reasoning (assume you are at a random point in time) rather than "anthropic" reasoning (assume you are a random observer from a class of observers). I'm not surprised the Copernican approach gives daft results, because the spatial version (assume you are at a random point in space) also gives daft results: see point 2 here in this thread.
Incidentally, there is a valid anthropic version of your argument: the prediction is that the universe will be uninhabitable 280 billion years from now, or at least contain many fewer observers than it does now. However, in that case, it looks like a successful prediction. The recent discovery that the stars are beginning to go out and that 95% of stars that will ever form have formed already is just the sort of thing that would be expected under anthropic reasoning. But it is totally surprising otherwise.
We can also reject the hypothesis that more than 4 trillion human lives will take place.
The correct application of anthropic reasoning only rejects this as a hypothesis about the average number of observers in a civilisation, not about human beings specifically. If we knew somehow (on other grounds) that most civilisations make it to 10 trillion observers, we wouldn't predict any less for human beings.
that any given 1-year-old will reach the age of 20,
That's an instance of the same error: anthropic reasoning does NOT reject the particular hypothesis. We already know that an average human lifespan is greater than 20, so we have no reason to predict less than 20 for a particular child. (The reason is that observing one particular child at age 1 as a random observation from the set of all human observations is no less probable if she lives to 100 than if she lives to 2).
The probability that the right sperm would fertilize the right egg and I would be conceived is much less than 1 in a billion, but that doesn't mean I think I need a new model
Anthropic reasoning is like any Bayesian reasoning: observations only count as evidence between hypotheses if they are more likely on one hypothesis than another. Also, hypotheses must be fairly likely a priori to be worth considering against the evidence. Suppose you somehow got a precise observation of sperm meeting egg to make you, with a genome analysis of the two: that exact DNA readout would be extremely unlikely under the hypothesis of the usual laws of physics, chemistry and biology. But that shouldn't make you suspect an alternative hypothesis (e.g. that you are some weird biological experiment, or a special child of god) because that exact DNA readout is extremely unlikely on those hypotheses as well. So it doesn't count as evidence for these alternatives.
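The point about likelihoods can be made concrete with a toy Bayesian update (the numbers below are illustrative assumptions, not figures from the comment):

```python
# An extremely unlikely observation is not evidence for an alternative
# hypothesis if it is equally unlikely under both hypotheses: the
# likelihood ratio is 1, so the posterior equals the prior.
prior_normal = 0.999        # P(usual physics/chemistry/biology)
prior_weird = 0.001         # P(weird experiment / special child of god)
p_obs_if_normal = 1e-12     # P(this exact DNA readout | usual biology)
p_obs_if_weird = 1e-12      # ...and the same tiny value under the alternative

joint_normal = prior_normal * p_obs_if_normal
joint_weird = prior_weird * p_obs_if_weird
posterior_normal = joint_normal / (joint_normal + joint_weird)
print(posterior_normal)  # 0.999 -- identical to the prior; no update occurred
```

Only when the two hypotheses assign *different* probabilities to the observation, as in the expansion example below, does the observation shift the posterior.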
The probability of being born prior to a galactic-wide expansion may be very low, but someone has to be born before the expansion. What's so special about me, that I should reject the possibility that I am such a person?
If all hypotheses gave extremely low probability of being born before the expansion, then you are correct. But the issue is that some hypotheses give high probability that an observer finds himself before expansion (the hypotheses where no civilisations expand, and all stay small). So your observations do count as evidence to decide between the hypotheses.
The 'Irrationality Game' posts in discussion came before my time here, but I had a very good time reading the bits written in the comments section. I also had a number of thoughts I would've liked to post and get feedback on, but I knew that, buried in such old threads, not much would come of them. So I asked around, and feedback from people has suggested that they would be open to a reboot!
I hereby again quote the original rules:
I would suggest placing *related* propositions in the same comment, but wildly different ones might deserve separate comments for keeping threads separate.
Make sure you put "Irrationality Game" as the first two words of a post containing a proposition to be voted upon in the game's format.
Here we go!
EDIT: It was pointed out in the meta-thread below that this could be done with polls rather than karma, so as to discourage playing-to-win and getting around the hiding of downvoted comments. If anyone resurrects this game in the future, please do so under that system. If you wish to test a poll format in this thread, feel free to do so, but continue voting as normal for comments that are not in poll format.