How many rationalists does it take to change a lightbulb?
Just one. They’ll take any excuse to change something.
How many effective altruists does it take to screw in a lightbulb?
Actually, it’s far more efficient if you convince someone else to screw it in.
How many Giving What We Can members does it take to change a lightbulb?
Fifteen have pledged to change it later, but we’ll have to wait until they finish grad school.
How many MIRI researchers does it take to screw in a lightbulb?
The problem is that there are multiple ways to parse that, and while it might naively seem like the ambiguity is harmless, it would actually be disastrous if any number of MIRI researchers tried to screw inside of a lightbulb.
How many CFAR instructors does it take to change a lightbulb?
By the time they’re done, the lightbulb should be able to change itself.
How many Leverage Research employees does it take to screw in a lightbulb?
I don’t know, but we have a team working to figure that out.
How many GiveWell employees does it take to change a lightbulb?
Not many. I don't recall the exact number; there’s a writeup somewhere on their site, if you care to check.
How many cryonicists does it take to change a lightbul...
How many neoreactionaries does it take to screw in a lightbulb?
Mu. We should all be using oil lamps instead, as oil lamps have been around for thousands of years, lightbulbs only a hundred. Also, oil lamps won't be affected by an EMP or solar flare. Reliable indoor lighting in general is a major factor in the increase of social degeneracy like nightclubs and premarital sex, and biological disorders like insomnia and depression. Lightbulbs are both a cause and an effect of social technology being outpaced by material conditions, and their place in society should be thoroughly reexamined, preferably via hundreds of blog posts and a few books. (Tangentially, blacks are five times more likely than whites to hate the smell of kerosene. How interesting.)
Alternatively, if you are already thoroughly pwned and/or gnoned, the answer is one, at a rate of $50 per lightbulb.
Edit: $45 if you or one of your friends has other electric work that could also be done. $40 if you are willing to take lessons later on how to fix your own damn house. $35 if you're willing to move to Idaho. $30 if you give a good reason to only charge $30 a bulb.
Moral Philosopher: How would you characterize irrational behavior?
Economist: When someone acts counter to their preferences.
Moral Philosopher: Oh, that’s what we call virtue.
Three logicians walk into a bar. The bartender asks, "Do you all want a drink?" The first says "I don't know", the second says "I don't know", and the third says "Yes".
"However, yields an even better joke (due to an extra meta level) when preceded by its quotation and a comma", however, yields an even better joke (due to an extra meta level) when preceded by its quotation and a comma.
"Is a even better joke than the previous joke when preceded by its quotation" is actually much funnier when followed by something completely different.
Q: What's quining?
A: "Is an example, when preceded by its quotation" is an example, when preceded by its quotation.
Three rationalists walk into a bar.
The first one walks up to the bar, and orders a beer.
The second one orders a cider.
The third one says "Obviously you've never heard of Aumann's agreement theorem."
An exponentially large number of Boltzmann Brains experience the illusion of walking into bars, and order a combination of every drink imaginable.
An attractive woman goes into a bar and enters into a drinking contest with Nick Bostrom. After repeatedly passing out, she wakes up the next day with a hangover and a winning lottery ticket.
Three neoreactionaries walk into a bar.
"Oh, how I hate these modern sluts," says the first one, watching some girls in miniskirts on the dancefloor. "We should return to the 1950s, when people acted respectably."
"Pfft, you call yourself reactionary?" replies the second. "I idolise 11th century Austria, where people acted respectably and there were no ethnic minorities."
"Ahh, but I am even more reactionary than either of you," boasts the third. "I long for classical Greece and Rome, the birthplace of western civilisation, where bisexuality was normal and people used to feast until they vomited!"
Q: Why did the AI researcher die?
A: They were giving a live AI demo and, while handing out papers, said "Damn, there are never enough paperclips - I wish I would never run out".
An AI robot and a human are hunting. The human is bitten by a snake, and is no longer breathing. The AI quickly calls 911. It yells "My hunting partner was bitten by a poisonous snake and I think they're dead!" The operator says "Calm down. First, make sure they're dead." A gunshot is heard. "Okay, now they're definitely dead."
Was the joke in that book? I'm pretty sure I've never read it, and I remember coming up with the joke.
Early 80s, I think. "All syllogisms" was one of my first mass-produced button slogans-- the business was started in 1977, but I took some years to start mass-producing slogans.
My printing records say that I did 3 print runs in 1988, but that plausibly means that I had been selling the button for a while because I don't think I was doing 3 print runs at a time.
Not an actual joke, but every time I reread Ayn Rand's dictum "check your premises," I can hear in the distance Eliezer Yudkowsky discreetly coughing and muttering, "check your priors."
There was a young man from Peru
Whose limericks stopped at line two.
There once was a man from Verdun.
And of course, there's the unfortunate case of the man named Nero...
This isn't exactly rationalist, but it correlates...
A man with Asperger's walks into a pub. He walks up to the bar, and says "I don't get it, what's the joke?"
Someone was going to tell me a Rationality joke, but the memetic hazard drove them to insanity.
A Bayesian, apparently, is someone who after a single throw of a coin will believe that it is biased. Based on either outcome.
Also, why do 'Bayes', 'base' and 'bias' sound similar?
Heck, I had to stop and take a pen and paper to figure that out. Turns out, you were wrong. (I expected that, but I wasn't sure how specifically.)
As a simple example, imagine that my prior belief is that 0.1 of coins always provide head, 0.1 of coins always provide tails, and 0.8 of coins are fair. So, my prior belief is that 0.2 of coins are biased.
I throw a coin and it's... let's say... head. What are the posterior probabilities? Multiplying the prior probabilities with the likelihood of this outcome, we get 0.1 × 1, 0.8 × 0.5, and 0.1 × 0. Multiplied and normalized, it is 0.2 for the heads-only coin, and 0.8 for the fair coin. -- My posterior belief remains 0.2 for biased coin, only in this case I know how specifically it is biased.
The same will be true for any symmetrical prior belief. For example, if I believe that 0.000001 of coins always provide head, 0.000001 of coins always provide tails, 0.0001 of coins provide head in 80% of cases, 0.0001 of coins provide tails in 80% of cases, and the rest are fair coins... again, after one throw my posterior probability of "a biased coin" will remain exactly the same, only the proportions of specific biases will change.
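For anyone who wants to check the arithmetic, here is a minimal Python sketch of the update above (my own illustration, not from the original comment), using the 0.1 / 0.8 / 0.1 prior:

```python
# Priors for the three hypothetical coin types from the example:
# always-heads, fair, always-tails. Update on one observed heads.
priors = {"always_heads": 0.1, "fair": 0.8, "always_tails": 0.1}
likelihood_heads = {"always_heads": 1.0, "fair": 0.5, "always_tails": 0.0}

# Bayes' rule: posterior is proportional to prior times likelihood.
unnormalized = {h: priors[h] * likelihood_heads[h] for h in priors}
total = sum(unnormalized.values())
posterior = {h: u / total for h, u in unnormalized.items()}

print(posterior)
# {'always_heads': 0.2, 'fair': 0.8, 'always_tails': 0.0}
# P(biased) is 0.2 both before and after the flip; only the direction
# of the suspected bias has been pinned down.
```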
"0.1 of coins always provide head"
Now there's a way to get people interested in learning probability.
My expectation of "this coin is biased" did not change, but "my expectation of the next result of this coin" changed.
In other words, I changed my expectation that the next flip will be heads, but I didn't change my expectation that approximately 500 of the next 1000 flips will be heads (which is just my credence that the coin is fair).
Connotationally: If I believe that biased coins are very rare, then my expectation that the next flip will be heads increases only a little. More precisely, if the proportion of biased coins is p, my expectation for the next flip increases by at most approximately p. The update based on one coin flip does not contradict common sense: it is as small as biased coins are rare, and as large as they are common.
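Again as an assumed illustration rather than anything from the thread, a quick numeric check of that bound: with a symmetric prior in which a fraction p of coins are deterministic (p/2 always-heads, p/2 always-tails), one observed heads moves P(next flip is heads) from 0.5 to 0.5 + p/2, comfortably within "approximately p":

```python
# Map each coin type's P(heads) to its prior probability, update on
# one observed heads, and return the posterior predictive P(heads).
def next_heads_after_one_heads(p):
    priors = {1.0: p / 2, 0.5: 1 - p, 0.0: p / 2}
    unnorm = {b: q * b for b, q in priors.items()}  # weight by likelihood
    z = sum(unnorm.values())
    return sum(b * w / z for b, w in unnorm.items())

for p in (0.2, 0.01, 0.000002):
    print(p, next_heads_after_one_heads(p) - 0.5)
# Prints roughly 0.1, 0.005, and 1e-06: a shift of p/2 in each case,
# i.e. as small as biased coins are rare.
```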
"I wonder what is the probability of random sc2 player being into math and cognitive biases"
"It's probably more one-sided than a Möbius strip"
This is a thread for rationality-related or LW-related jokes and humor. Please post jokes (new or old) in the comments.
------------------------------------
Q: Why are Chromebooks good Bayesians?
A: Because they frequently update!
------------------------------------
A super-intelligent AI walks out of a box...
------------------------------------
Q: Why did the psychopathic utilitarian push a fat man in front of a trolley?
A: Just for fun.