IRRATIONALITY GAME
Eliezer Yudkowsky has access to a basilisk kill agent that allows him, with a few clicks, to untraceably assassinate any person he can get to read a short email or equivalent, with efficiency comparable to what is shown in Death Note.
Probability: improbable (2%)
If such a universal basilisk exists, wouldn't it almost by definition kill the person who discovered it?
I think it's vaguely plausible such a basilisk exists, but I also think you are suffering from the halo effect around EY. Why would he of all people know about the basilisk? He's just some blogger you read who says things as though they are Deep Wisdom so people will pay attention.
Wouldn't the world be observably different if everyone of EY's intellectual ability or above had access to a basilisk kill agent? And wouldn't we expect a rash of inexplicable deaths among people who are capable of constructing a basilisk but not of immunizing themselves against one?
Are basilisks necessarily fatal? If the majority of basilisks caused insanity or the loss of intellectual capacity instead of death, I would expect to see a large group of people who considered themselves capable of constructing basilisks, but who on inspection turned out to be crazy or not nearly that bright after all.
...
Oh, shit.
Not necessarily. If I did, in fact, possess such a basilisk, I cannot think offhand of any occasion where I would have actually used it. Robert Mugabe doesn't read my emails, it's not clear that killing him saves Zimbabwe, I have ethical inhibitions that I consider to exist for good reasons, and have you thought about what happens if somebody else glances at the computer screen afterward, and resulting events lead to many agents/groups possessing a basilisk?
2% is way way way WAY too high for something like that. You shouldn't be afraid to assign a probability much closer to 0.
2% is too high a credence for belief in the existence of powers for which (as far as I know) not even anecdotal evidence exists. It's the realm of speculative fiction, well beyond the current ability of psychological and cognitive science and, one imagines, rather difficult to control.
But ascribing such a power to a specific individual who hasn't had any special connection to cutting edge brain science or DARPA and isn't even especially good at using conventional psychological weapons like 'charm' is what sends your entry into the realm of utter and astonishing absurdity.
Point taken; it's not a strategy for arriving at truths, it's a snappy comeback to a failure mode I'm getting really tired of. The fact that something is in the realm of speculative fiction is not a valid argument in a world full of cyborgs, tablet computers, self-driving cars, and causality-defying decision theories. And yes, basilisks.
The argument isn't that because something is found in speculative fiction it can't be real; it's that this thing you're talking about isn't found outside of speculative fiction, i.e., it's not real. Science can't do that yet. If you're familiar with the state of a science, you have a good sense of what is and isn't possible yet. "A basilisk kill agent that allows him, with a few clicks, to untraceably assassinate any person he can get to read a short email or equivalent, with efficiency comparable to what is shown in Death Note" is very likely one of those things. I mention "speculative fiction" because a lot of people have a tendency to privilege hypotheses they find in such fiction.
Hypnotism is not the same as what you're talking about. The Roko 'basilisk' is a joke compared to what you're describing. None of these are anecdotal evidence for the power you are describing.
Irrationality Game
If we are in a simulation, a game, a "planetarium", or some other form of environment controlled by transhuman powers, then 2012 may be the planned end of the game, or end of this stage of the game, foreshadowed within the game by the Mayan calendar, and having something to do with the Voyager space probe reaching the limits of the planetarium enclosure, the galactic center lighting up as a gas cloud fell in 30,000 years ago, or the discovery of the Higgs boson.
Since we have to give probabilities, I'll say 10%, but note well, I'm not saying there is a 10% probability that the world ends this year, I'm saying 10% conditional on us being in a transhumanly controlled environment; e.g., that if we are in a simulation, then 2012 has a good chance of being a preprogrammed date with destiny.
Irrationality Game
For reasons related to Gödel's incompleteness theorems and mathematically proven lower bounds on the difficulty of certain problems, I believe there is an upper limit on how intelligent an agent can be. (90%)
I believe that human hardware can - in principle - be as intelligent as it is possible to be. (60%) To be clear, this doesn't actually occur in the real world we currently live in. I consider the putatively irrational assertion roughly isomorphic to asserting that AGI won't go FOOM.
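For concreteness, one standard example of the kind of proven lower bound the first claim leans on (my example, not the original poster's): any comparison-based sorting algorithm needs at least

$$\log_2(n!) = \Omega(n \log n)$$

comparisons in the worst case, because each comparison yields at most one bit of information and there are $n!$ possible orderings to distinguish.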
If you voted already, you might not want to vote again.
Particularly with a generational time of 15-25 years, and with the fact that evolution basically stopped working as an enhancer once humans passed the threshold of preventing most premature deaths (where premature just means before the end of the reproductive window).
This is way off for almost all of human history almost everywhere. See the work of Greg Clark: occupational success and wealth in pre-industrial Britain were strongly correlated with the number of surviving children, as measured by public records of birth, death, and estates. Here's an essay by Ron Unz discussing similar patterns in China. Or look at page 12 of Greg Cochran and company's paper on the evolutionary history of Ashkenazi intelligence. Over the last 10,000 years evolutionary selective sweeps have actually greatly accelerated in the course of adapting to agricultural and civilized life.
How did intelligence, or earnings affected by intelligence, get converted into more surviving children?
Computationalism is an incorrect model of cognition. Brains compute, but mind is not what the brain does. There is no self hiding inside your apesuit. You are the apesuit. Minds are embodied and extended, and a major reason why the research program to build synthetic intelligences has largely gone nowhere since its inception is the failure of many researchers to understand/agree with this idea.
70%
Irrationality Game
I believe that exposure to rationality (in the LW sense), in its current state, generally does more harm than good^ to someone who's already a skeptic. 80%
^ In the sense of generating less happiness and in general less "winning".
I'll bite:
The U.S. government deliberately provoked the attack on Pearl Harbor through diplomacy and/or fleet redeployment, and it was not by chance that the carriers of the U.S. Pacific Fleet weren't in port when the attack happened.
Very confident. (90-95%)
By the way, the reason I assume I am personally more rational about this than the LW average is that there are lots of US Americans around here, and I have sufficient evidence to believe that people tend to become less rational if a topic centrally involves a country they are emotionally involved with or whose educational system they went through.
I don't have a lot of strong reasons to disbelieve you, but what evidence makes you think this is so?
Regarding the first part, the truth of that statement critically depends on how exactly you define "provoke." For some reasonable definitions, the statement is almost certainly true; for others, probably not.
As for the second part (the supposed intentional dispersion of the carriers), I don't think that's plausible. If anything, the U.S. would have been in a similar position, i.e. at war with Japan with guaranteed victory, even if every single ship under the U.S. flag magically got sunk on December 7, 1941. So even if there was a real conspiracy involved, it would have made no sense to add this large and risky element to it just to make the eventual victory somewhat quicker.
Also, your heuristic about bias is broken. In the Western world outside of the U.S., people are on average, if anything, only more inclined to believe the official historical narrative about WW2.
The "and it was not chance" bit? That requires the conspirators be non-human.
Carrier supremacy was hardly an established doctrine, much less proved in battle; orthodox belief since Mahan was that battleships were the most important ships in a fleet. The orthodox method of preserving the US Navy's power would have been to disperse battleships, not carriers. Even if the conspirators were all believers in the importance of carriers, even a minimum of caution would have led them to find an excuse to also save some of the battleships. To believe at 90% confidence that a group of senior naval officials, while engaging in a high-stakes conspiracy, also took a huge un-hedged gamble on an idea that directly contradicted the established naval dogma they were steeped in since they were midshipmen, is ludicrous.
Irrationality Game
Being a materialist doesn't exclude nearly as much of the magical, religious, and anomalous as most materialists believe because matter/energy is much weirder than is currently scientifically accepted.
75% certainty.
irrationality game: The universe is, due to some non-reducible (i.e. non-physical) entity, indeterministic. 95%. That entity is the human mind (not the brain). 90%.
Irrationality Game:
These claims assume MWI is true.
Claim #1: Given that MWI is true, a sentient individual will be subjectively immortal. This is motivated by the idea that branches in which death occurs can be ignored and that there are always enough branches for some form of subjective consciousness to continue.
Claim #2: The vast majority of the long-term states a person will experience will be so radically different than the normal human experience that they are akin to perpetual torture.
P(Claim #1) = 60%
P(Claim #2 | Claim #1) = 99%
Irrationality game
Money does buy happiness. In general the rich and powerful are in fact ridiculously happy to an extent we can't imagine. The hedonic treadmill and similar theories are just a product of motivated cognition, and the wealthy and powerful have no incentive to tell us otherwise. 30%
Irrationality game comment:
Imagine that we transformed the Universe using some elegant mathematical mapping (think of a Fourier transform of the phase space), or that we were able to see the world through different quantum observables than we have today (seeing the world primarily in momentum space, or even being able to experience "collapses" to eigenvectors not of x or p, but of a different, for us unobservable, operator, e.g. xp). Then we would observe complex structures, perhaps with their own evolution and life and intelligence. That is, aliens can be all around us but remain as invisible as the Mona Lisa in a Fourier-transformed picture from the Louvre.
Probability: 15%.
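A minimal sketch of the "Mona Lisa in Fourier space" point, assuming numpy; the random array stands in for a real picture:

```python
# The transform is lossless, yet the picture is unrecognizable in the
# transformed basis.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((256, 256))     # stand-in for a grayscale photograph

spectrum = np.fft.fft2(image)      # same information, different basis
recovered = np.fft.ifft2(spectrum).real

# Lossless: the original is exactly recoverable...
assert np.allclose(recovered, image)
# ...but spatial structure (faces, edges, objects) is invisible when
# you look directly at the transformed array.
print(np.abs(spectrum)[:2, :2])
```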
Irrationality Game
It's possible to construct a relatively simple algorithm to distinguish superstimulatory/akrasia-inducing media from novel, educational, or insightful content. Such an algorithm need not make use of probabilistic classifiers or machine-learning techniques that rely on my own personal tastes. The distinction can be made based on testable, objective properties of the material. (~20%)
(This is a bit esoteric. I am starting to think up aggressive tactics to curb my time-wasteful internet habits, and was idly fantasising about a browser plugin that would tell me whether the link I was about to follow was entertaining glurge or potentially valuable. In wondering how that would work, I started thinking about how I classify it. My first thought was that it's a subjective judgement call, and that a naive acid test distinguishing the two would be tantamount to magic. After thinking about it a little longer, I've developed some modestly-weighted fuzzy intuitions that there is some objective property I use to classify them, and that this may map faithfully onto how other people classify them. A rough sketch of such a scorer follows below.)
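A hedged sketch of what such a plugin's scorer might look like; every feature, weight, and threshold below is an invented placeholder, chosen only to illustrate "testable, objective properties", not a claim about which properties actually separate glurge from insight:

```python
import re

def superstimulus_score(text: str) -> float:
    """Rough 0-1 score; higher = more likely superstimulatory filler."""
    words = text.split()
    n = max(len(words), 1)
    exclamation_rate = text.count("!") / n              # breathless tone
    listicle = 1.0 if re.search(r"\b\d+\s+(things|ways|reasons)\b",
                                text, re.IGNORECASE) else 0.0
    avg_word_len = sum(len(w) for w in words) / n       # crude reading level
    simplicity = max(0.0, (4.5 - avg_word_len) / 4.5)
    return min(1.0, 3.0 * exclamation_rate + 0.4 * listicle + 0.4 * simplicity)

print(superstimulus_score("27 Things You Won't Believe! Amazing!"))          # high
print(superstimulus_score("We derive a lower bound for comparison sorts."))  # low
```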
Irrationality Game
Aaron Swartz did not actually commit suicide. (10%)
(Hat tip to Quirinus Quirrell, whoever that actually is.)
An alien civilization within the boundaries of the current observable universe has, or will have within the next 10 billion years, created a work of art which includes something directly analogous to the structure of the "dawn motif" from the beginning of Richard Strauss's Also sprach Zarathustra. (~90%)
The case for atheistic reductionism is not a slam-dunk.
While atheistic reductionism is clearly simpler than any of the competing hypotheses, each added bit of complexity doubles the size of the hypothesis space. Some of these additional hypotheses will be ruled out due to impossibility or inconsistency with observation, but that still leaves a huge number of possible hypotheses, each taking up a tiny amount of probability mass that nevertheless adds up.
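One way to formalize the "adds up" point, using a Kolmogorov-style complexity prior (my notation, not part of the original comment): if each hypothesis $H$ with description length $\ell(H) = k$ bits gets prior mass proportional to $2^{-k}$, then each extra bit halves any single hypothesis's prior but doubles the number of hypotheses of that length, so the total mass available at each complexity level stays bounded but non-trivial:

$$\sum_{H:\,\ell(H)=k} 2^{-k} \le 2^k \cdot 2^{-k} = 1.$$

Individually negligible alternatives are not collectively negligible.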
I would give atheistic reductionism a ~30% probability of being true. (I would still assign specific human religions or a specific simulation scenario approximately zero probability.)
There is no dark matter. Gravity behaves weirdly for some other reason we haven't discovered yet. (85%)
Irrationality Game
Prediction markets are a terrible way of aggregating probability estimates. They only enjoy the popularity they do because of a lack of competition, and because they're cheaper to set up due to the built-in incentive to participate. They do slightly worse than simply averaging a bunch of estimates, and would be blown out of the water by even a naive histocratic algorithm (weighted average based on past predictor performance using Bayes). The performance problems of prediction markets are not just due to liquidity issues, but would inevi...
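A minimal sketch of the "histocratic" aggregator the claim describes, on the assumption that "weighted average based on past predictor performance using Bayes" means each forecaster is weighted by the likelihood of their past predictions (my reading, not a tested algorithm):

```python
import math

def histocratic_average(forecasts, past_log_scores):
    """forecasts: each forecaster's current probability for the event.
    past_log_scores: each forecaster's summed past log score
    (log p when the event happened, log(1 - p) when it didn't)."""
    weights = [math.exp(s) for s in past_log_scores]  # better record -> larger weight
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, forecasts)) / total

# Three forecasters; the middle one has the best track record, so the
# aggregate is pulled toward her estimate of 0.6.
print(histocratic_average([0.2, 0.6, 0.9], [-3.0, -1.0, -4.0]))  # ~0.57
```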
They do slightly worse than simply averaging a bunch of estimates, and would be blown out of the water by even a naive histocratic algorithm (weighted average based on past predictor performance using Bayes)
Fantastic. Please tell me which markets this applies to and link to the source of the algorithm that gives me all the free money.
And the amount of money to be made is small in any event because there's just not enough participation in the markets.
Aren't prediction markets just a special case of financial markets? (Or vice versa.) Then if your algorithm could outperform prediction markets, it could also outperform the financial ones, where there is lots of money to be made.
In prediction markets, you are betting money on your probability estimates of various things X happening. On financial markets, you are betting money on your probability estimates of the same things X, plus your estimate of the effect of X on the prices of various stocks or commodities.
The conventional reply is that noise traders improve markets by making rational prediction more profitable. This is almost certainly true for short-term noise, and my guess is that it's false for long-term noise, i.e., if prices revert in a day, noise traders improve a market, if prices take ten years to revert, the rational money seeks shorter-term gains. Prediction markets may be expected to do better because they have a definite, known date on which the dumb money loses - you can stay solvent longer than the market stays irrational.
Irrationality game:
Humanity has already received and recorded a radio message from another technological civilization. This went unconfirmed/unnoticed due to being very short and unrepeated, or mistaken for a transient terrestrial signal, or modulated in ways we were not looking for, or was otherwise overlooked. 25%.
What are the rules on multiple postings? I have a cluster of related (to each other, not this) ones I would love to post as a group.
Irrationality game comment
The importance of waste heat in the brain is generally under-appreciated. An overheated brain is a major source of mental exhaustion, akrasia, and brain fog. One easy way to increase the amount of practical intelligence we can bring to bear on complicated tasks (with or without an accompanying increase in IQ itself) is to improve cooling of the brain. This would be most effective with some kind of surgical cooling system thingy, but even simple things like being in a cold room could help.
Confidence: 30%
Multiple systems are correct about their experiences. In particular, killing a N-person system is as bad as killing N singlets. (90%)
Irrationality Game:
I believe Plato (and others) were right when they said music develops some form of sensibility, some sort of compassion. I posit a link between the capacity to understand music and the capacity to understand other people by creating accurate images of them, and of how they feel, in our heads. 80%
Irrationality Game:
The Occam argument against theism, in the forms typically used in LW invoking Kolmogorov complexity or equivalent notions, is a lousy argument: its premises and conclusions are not incorrect, but it is question-begging to the point that no intellectually sophisticated theist should move their credence significantly by it. 75%.
(It is difficult to meaningfully attach a probability to this kind of claim, which is not about hard facts. I guesstimated that in an ideally open-minded and reasoned philosophical discussion, there would be a 25% chance of me being persuaded of the contrary.)
Irrationality game
I have a suspicion that some form of moral particularism is the most sensible moral theory. 10% confidence.
...Moral particularism is the view that there are no moral principles and that moral judgement can be found only as one decides particular cases, either real or imagined. This stands in stark contrast to other prominent moral theories, such as deontology or utilitarianism. In the former, it is asserted that people have a set of duties (that are to be considered or respected); in the latter, people are to respect the happiness or the preferences of everyone affected.
It is plausible that an existing species of dolphin or whale possesses symbolic language and oral culture at least on par with that of neolithic-era humanity. (75%)
Is "it is plausible" part of the statement to which you give 75% credence, or is it another way of putting said credence?
Because cetacean-language is more than 75% likely to be plausible but I think less than 75% likely to be true.
I proposed a variation on this game, optimized for usefulness instead of novelty: the "maximal update game". Start with a one sentence summary of your conclusion, then justify it. Vote up or down the submissions of others based on the degree to which you update on the one sentence summary of the person's conclusion. (Hence no UFOs at the top, unless good arguments for them can be made.)
If anyone wants to try this game, feel free to do it in replies to this comment.
Irrationality Game
The Big Bang is not the beginning of the universe, nor is it even analogous to the beginning of the universe. (60% confident)
Time travel is physically possible, and therefore will be achieved someday.
~80%
Irrationality game comment
The correct way to handle Pascal's Mugging and other utilitarian mathematical difficulties is to use a bounded utility function. I'm very metauncertain about this; my actual probability could be anywhere from 10% to 90%. But I guess that my probability is 70% or so.
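A minimal worked example of the bounded approach (the functional form is illustrative, not canonical): take

$$U(x) = U_{\max}\left(1 - e^{-x/s}\right),$$

which is increasing in the payoff $x$ but never exceeds $U_{\max}$. Then however large a payoff the mugger promises, a probability $p$ that the threat is genuine contributes at most $p\,U_{\max}$ to expected utility, so arbitrarily inflated promises can no longer dominate the decision.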
Irrationality game:
Different levels of description are just that, and are all equally "real". Describing a gas in terms of particles, as in statistical mechanics, is as correct/real as describing it in terms of bulk quantities, as in thermodynamics.
The same goes for the mind: a description in terms of neurochemistry and a description in terms of thoughts are equally correct/real.
80% confidence
Dark arts are very toxic, in the sense that you naturally and necessarily use any and all of your relevant beliefs to construct self-serving arguments on most occasions. Moreover, once you happen to successfully use some rationality technique in a self-serving manner, you become more prone to using it in such a way on future occasions. Thus, once you catch other people using dark arts and understand what's going on, you are more likely to use the same tricks yourself. >80% sure (I don't have an intuitive feeling for amounts of evidence, but here I would need at least 6dB of evidence to become uncertain).
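For readers unfamiliar with the unit (my gloss, using the standard definition of evidence in decibels): $e = 10 \log_{10}(\text{likelihood ratio})$, so 6 dB corresponds to a likelihood ratio of $10^{0.6} \approx 4:1$ in favor of the hypothesis.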
I was very interested in the discussions and opinions that grew out of the last time this was played, but find digging through 800+ comments for a new game to start on the same thread annoying. I also don't want this game ruined by a potential sock puppet (whoever it may be). So here's a non-sockpuppeteered Irrationality Game, if there's still interest. If there isn't, downvote to oblivion!
The original rules:
Enjoy!