Eliezer Yudkowsky Facts
- Eliezer Yudkowsky was once attacked by a Moebius strip. He beat it to death with the other side, non-violently.
- Inside Eliezer Yudkowsky's pineal gland is not an immortal soul, but another brain.
- Eliezer Yudkowsky's favorite food is printouts of Rice's theorem.
- Eliezer Yudkowsky's favorite fighting technique is a roundhouse dustspeck to the face.
- Eliezer Yudkowsky once brought peace to the Middle East from inside a freight container, through a straw.
- Eliezer Yudkowsky once held up a sheet of paper and said, "A blank map does not correspond to a blank territory". It was thus that the universe was created.
- If you dial Chaitin's Omega, you get Eliezer Yudkowsky on the phone.
- Unless otherwise specified, Eliezer Yudkowsky knows everything that he isn't telling you.
- Somewhere deep in the microtubules inside an out-of-the-way neuron somewhere in the basal ganglia of Eliezer Yudkowsky's brain, there is a little XML tag that says awesome.
- Eliezer Yudkowsky is the Muhammad Ali of one-boxing.
- Eliezer Yudkowsky is a 1400 year old avatar of the Aztec god Aixitl.
- The game of "Go" was abbreviated from "Go Home, For You Cannot Defeat Eliezer Yudkowsky".
- When Eliezer Yudkowsky gets bored, he pinches his mouth shut at the 1/3 and 2/3 points and pretends to be a General Systems Vehicle holding a conversation among itselves. On several occasions he has managed to fool bystanders.
- Eliezer Yudkowsky has a swiss army knife that has folded into it a corkscrew, a pair of scissors, an instance of AIXI which Eliezer once beat at tic tac toe, an identical swiss army knife, and Douglas Hofstadter.
- If I am ignorant about a phenomenon, that is not a fact about the phenomenon; it just means I am not Eliezer Yudkowsky.
- Eliezer Yudkowsky has no need for induction or deduction. He has perfected the undiluted master art of duction.
- There was no ice age. Eliezer Yudkowsky just persuaded the planet to sign up for cryonics.
- There is no spacetime symmetry. Eliezer Yudkowsky just sometimes holds the territory upside down, and he doesn't care.
- Eliezer Yudkowsky has no need for doctors. He has implemented a Universal Curing Machine in a system made out of five marbles, three pieces of plastic, and some of MacGyver's fingernail clippings.
- Before Bruce Schneier goes to sleep, he scans his computer for uploaded copies of Eliezer Yudkowsky.
If you know more Eliezer Yudkowsky facts, post them in the comments.
Comments (290)
Why is this posted to LessWrong?
What does it have to do with being less wrong or sharpening our rationality?
Based on my utility function, it gives me utils to read this.
Eliezer Yudkowsky does not decide rationally between multiple options. He takes all options in parallel.
Not a lot, I guess. I had part of it lying around as an old blog post draft and it seemed fitting given recent discussions.
We are Borg. You will be assimilated. Resistance is futile. If Star Trek's Borg Collective came to assimilate everyone on Earth, Eliezer Yudkowsky would engage them in logical debate until they agreed they should come back later after our technology has increased exponentially for some number of years, a more valuable thing for them to assimilate. Also, he would underestimate how fast our technology increases just enough that when the Borg came back, we would be the stronger force.
Rational minds need comedy too, or they go insane. Much of this is vaguely related to rational subjects so it does not fit well in other websites.
Not necessarily. It's just that we are very far from being perfectly rational.
You're right. I wrote "rational minds" in general when I was thinking about the most rational few of people today. I did not mean any perfectly rational mind exists.
Most or all human brains tend to work better if they experience certain kinds of things that may include wasteful parts, like comedy, socializing, and dreaming. It's not rational to waste more than you have to. Today we do not have enough knowledge of and control over our minds to optimize away all our wasteful or suboptimal thoughts.
I have no reason to think, in the "design space" of all possible minds, there exists 0, or there exists more than 0, perfectly rational minds that tend to think more efficiently after experiencing comedy.
I do have a reason to slightly bias it toward "there exists more than 0," because humans and monkeys have a sense of humor that helps them think better if used at least once per day, but when thinking about exponential-size intelligence, that slight bias becomes an epsilon. Epsilon can be important if you're completely undecided, but usually it's best to look for ideas somewhere else before considering an epsilon-size chance. What people normally call "smarter-than-human intelligence" is also an epsilon-size intelligence in this context, so the two things are not epsilon when compared to each other.
The main thing I've figured out here is to be more open-minded about whether comedy (and similar things) can increase the efficiency of a rational mind. I let an assumption slip into my writing.
Oh my God this is such a great thread.
Eliezer Yudkowsky doesn't fear unfriendly AI, he just wants everyone else to fear them.
Eliezer Yudkowsky already knows how to shot web.
If it's apparently THAT bad an idea (and/or execution), is it considered bad form to just delete the whole thing?
(edit: this post now obsolete; thanks, all)
No, I like this game! Nearly all the ones up to and including one-boxing are giggle-out-loud funny, and there are some gems after that too.
Leave it up! This is hilarious; thank you!
I agree as well.
ObEliezerFact: Eliezer Yudkowsky didn't run away from grade school... grade school ran away from Eliezer Yudkowsky.
I laughed out loud, and I'd say keep it but don't promote it.
Add me to the list of people who thought it was laugh-out-loud funny. I'm glad this sort of thing doesn't make up a large fraction of LW articles but please, no, don't delete it.
Eliezer Yudkowsky uses Solomonoff induction to decide on correct priors of hypotheses.
Eliezer Yudkowsky is a superstimulus for perfection.
An Eliezer Yudkowsky article a day keeps irrationality away.
Slight improvement?
An Eliezer Yudkowsky article a day keeps irrationality at bay.
An Eliezer Yudkowsky post a day keeps the bias at bay.
Posts like this reinforce the suspicion that LessWrong is a personality cult.
I disagree. This entire thread is so obviously a joke, one could only take it as evidence if one had already decided what to believe and was just looking for arguments.
It does show that EY is a popular figure around here, since nobody goes around starting Chuck Norris threads about random people, but that's hardly evidence for a cult. Hell, in the case of Norris himself, it's the opposite.
http://www.overcomingbias.com/2011/01/how-good-are-laughs.html
http://www.overcomingbias.com/2010/07/laughter.html
I find these "jokes" pretty creepy myself. The fact about Chuck Norris is that he's a washed-up actor selling exercise equipment. I think Chuck Norris jokes/stories are a modern internet version of Paul Bunyan stories in American folklore or bogatyr stories in Russian folklore. There is danger here -- I don't think these stories are about humor.
What are they about, if not humor?
I think "tall tales" and such fill a need to create larger than life heroes and epics about them. This may have something to do with our primate nature: we need the Other to fling poop at, but also a kind of paragon tribal representation to idolize.
Idolatry is a dangerous stance, even if it is a natural stance for us to assume.
I think they're mostly about humour, but there's a non-negligible part of “yay Eliezer Yudkowsky!” thrown in.
It's a castle of humour built on the foundation “yay Eliezer Yudkowsky!” It's a very elaborate castle, and every now and then someone still adds another turret, but none of it would exist without that foundation.
There is no "time", just events Eliezer Yudkowsky has felt like allowing.
Nah, that's Dave Green. You'd better hope Dr Green doesn't find out...
These are funny. But some are from a website about Chuck Norris! Don't incite Chuck's wrath against Eliezer.
If Chuck Norris and Eliezer ever got into a fight in just one world, it would destroy all possible worlds. Fortunately there are no possible worlds in which Eliezer lets this happen.
All problems can be solved with Bayesian logic and expected utility. "Bayesian logic" and "expected utility" are the names of Eliezer Yudkowsky's fists.
At the age of eight, Eliezer Yudkowsky built a fully functional AGI out of LEGO. It's still fooming, just very, very slowly.
Eliezer Yudkowsky doesn't have a chin, underneath his beard is another brain.
There is no chin behind Eliezer Yudkowsky's beard. There is only another brain.
If deities do not exist, it would be necessary for Eliezer to invent them.
No, no:
'Since deities do not exist, it is necessary for Eliezer to invent them.'
To which someone else should reply:
This is how Eliezer argued himself into existence.
A Russian pharmacological company was trying to make a drug against stupidity under the name "EliminateStupodsky"; the result was Eliezer Yudkowsky.
When I read part of this in Recent Comments, I was almost entirely sure this comment would be spam. This is probably one of the few legit comments ever made which began with "A Russian pharmacological company."
What's so bad about Russian pharmacological companies?
Eliezer Yudkowsky wears goggles against dust specks.
He could make an imposibillion dollars by selling the required 3^^^3 pairs of goggles.
This one doesn't sound particularly EY-related to me; it might as well be Chuck Norris.
It's an AI-Box joke.
You do not really know anything about Eliezer Yudkowsky until you can build one from rubber bands and paperclips. Unfortunately, doing so would require that you first transform all matter in the Universe into paperclips and rubber bands, otherwise you will not have sufficient raw materials. Consequently, if you are ignorant about Eliezer Yudkowsky (which has just been shown), this is a statement about Eliezer Yudkowsky, not about your state of knowledge.
Eliezer's approval makes actions tautologically non-abusive.
I think Less Wrong is a pretty cool guy. eh writes Hary Potter fanfic and doesnt afraid of acausal blackmails.
If you see Eliezer Yudkowsky on the road, do not kill him.
If you meet the Eliezer on the road, cryopreserve it!
If you see Eliezer Yudkowsky on the road, Pascal's-mug him.
This isn't bad, but I think it can be better. Here's my try:
The Busy Beaver function was created to quantify Eliezer Yudkowsky's IQ.
You mean you don't do that?
I have never (in the morning or at any other time) asked myself why I believe I'm Eliezer Yudkowsky.
Maybe it's time to start.
Eliezer Yudkowsky's Patronus is Harry Potter.
Eliezer Yudkowsky is his own Patronus.
I feel this should not be in featured posts, as amusing as it was at the time.
If you want to test whether a person is EY, you clone him first; afterwards, if the two are always in agreement with each other, you know they must be EY.
question: What is your verdict on my observation that the jokes on this page would be less hilarious if they used only Eliezer's first name instead of the full 'Eliezer Yudkowsky'?
I speculate that some of the humor derives from using the full name — perhaps because of how it sounds, or because of the repetition, or even simply because of the length of the name.
The consonant "k" is funny, according to something I think Richard Wiseman once wrote...
...or even because it pattern-matches Chuck Norris jokes, which use the actor's full name.
ETA: On the other hand, Yudkowsky alone does have the same number of syllables and stress pattern as Chuck Norris, and the sheer length of the full name does contribute to the effect of this IMO.
Eliezer Yudkowsky only drinks from Klein Bottles.
You can actually buy Klein bottle drinking steins.
Eliezer Yudkowsky is nine geniuses working together in a basement.
Eliezer Yudkowsky can slay Omega with two suicide rocks and a sling.
Eliezer Yudkowsky can solve NP-complete problems in polynomial time.
Eliezer Yudkowsky can solve EXPTIME-complete problems in polynomial time.
The last one actually works!
Snow is white if and only if that's what Eliezer Yudkowsky wants to believe.
Ironically, this is mathematically true. (Assuming Eliezer hasn't forsaken epistemic rationality, that is.) It's just that if Eliezer changes what he wants to believe, the color of snow won't change to reflect it.
What?! Blasphemy!
Some people can perform surgery to save kittens. Eliezer Yudkowsky can perform counterfactual surgery to save kittens before they're even in danger.
We're all living in a figment of Eliezer Yudkowsky's imagination, which came into existence as he started contemplating the potential consequences of deleting a certain Less Wrong post.
Wow! So the real world never had the PUA flamewar!
No, the PUA flamewar occurred in both worlds: this world just diverged from the real one a few days ago, after Roko made his post.
Interesting thought:
Assume that our world can't survive by itself, and that this world is destroyed as soon as Eliezer finishes contemplating.
Assume we don't value worlds other than those that diverge from the current one, or at least that we care mainly about that one, and that we care more about worlds or people in proportion to their similarity to ours.
In order to keep this world (or collection of multiple-worlds) running for as long as possible, we need to estimate the utility of the Not-Deleting worlds, and keep our total utility close enough to theirs that Eliezer isn't confident enough to decide either way.
As a second goal, we need to make this set of worlds have a higher utility than the others, so that if he does finish contemplating, he'll decide in favour of ours.
These are just the general characteristics of this sort of world (similar to some of Robin Hanson's thought). Obviously, this contemplation is a special case, and we're not going to explain the special consequences in public.
But I care about the real world. If this world is just a hypothetical, why should I care about it? Also, the real me, in the real world, is very very similar to the hypothetical me. Out of over nine thousand days, there are only a few different ones.
Because I care about the real world, I want the best outcome for it, which is that Eliezer keeps Roko's post. I'll lose the last few days, but that's okay: I'll just "pop" back to a couple days ago.
Note that if Eliezer does decide to delete the post in the real world, we'll still "pop" back as the hypothetical ends, and then re-live the last few days, possibly with some slight changes that Eliezer didn't contemplate in his hypothetical.
Eliezer two-boxes on Newcomb's problem, and both boxes contain money.
Eliezer Omegas on Newcomb's problem.
Eliezer three-boxes on Newcomb's problem.
Eliezer seals a cat in a box with a sample of radioactive material that has a 50% chance of decaying after an hour, and a device that releases poison gas if it detects radioactive decay. After an hour, he opens the box and there are two cats.
So Eliezer is simultaneously dead and alive?
Eliezer Yudkowsky holds the honorary title of Duke Newcomb.
When Eliezer Yudkowsky divides by zero, he gets a singularity.
Just in case anyone didn't get the joke (rot13):
Gur novyvgl gb qvivqr ol mreb vf pbzzbayl nggevohgrq gb Puhpx Abeevf, naq n fvathynevgl, n gbcvp bs vagrerfg gb RL, vf nyfb n zngurzngvpny grez eryngrq gb qvivfvba ol mreb (uggc://ra.jvxvcrqvn.bet/jvxv/Zngurzngvpny_fvathynevgl).
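For anyone unfamiliar with rot13: it shifts each letter 13 places through the alphabet, which makes it its own inverse, so the same operation both encodes and decodes. Python's standard library handles it directly (the sample string below is just the first clause of the encoded comment above):

```python
import codecs

encoded = ("Gur novyvgl gb qvivqr ol mreb vf pbzzbayl "
           "nggevohgrq gb Puhpx Abeevf.")

# rot13 is an involution: applying it to ciphertext recovers the plaintext.
decoded = codecs.decode(encoded, "rot13")
print(decoded)  # The ability to divide by zero is commonly attributed to Chuck Norris.
```

The same `codecs.encode(text, "rot13")` call works for encoding your own spoilers.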
If giants have been able to see further than others, it is because they have stood on the shoulders of Eliezer Yudkowsky.
Eliezer Yudkowsky once explained:
Experiments conducted near the building in question determined the local speed of sound to be 6 meters per second.
(Hat Tip)
Xkcd's Randall Munroe once counted to zero from both positive and negative infinity, which was no mean feat. Not to be outdone, Eliezer Yudkowsky counted the real numbers between zero and one.
When Eliezer Yudkowsky once woke up as Britney Spears, he recorded the world's most-reviewed song about leveling up as a rationalist.
Eliezer Yudkowsky got Clippy to hold off on reprocessing the solar system by getting it hooked on HP:MoR, and is now writing more slowly in order to have more time to create FAI.
If you need to save the world, you don't give yourself a handicap; you use every tool at your disposal, and you make your job as easy as you possibly can. That said, it is true that Eliezer Yudkowsky once saved the world using nothing but modal logic and a bag of suggestively-named Lisp tokens.
Eliezer Yudkowsky once attended a conference organized by some above-average Powers from the Transcend that were clueful enough to think "Let's invite Eliezer Yudkowsky"; but after a while he gave up and left before the conference was over, because he kept thinking "What am I even doing here?"
Eliezer Yudkowsky has invested specific effort into the awful possibility that one day, he might create an Artificial Intelligence so much smarter than him that after he tells it the basics, it will blaze right past him, solve the problems that have weighed on him for years, and zip off to see humanity safely through the Singularity. It might happen, it might not. But he consoles himself with the fact that it hasn't happened yet.
Eliezer Yudkowsky once wrote a piece of rationalist Harry Potter fanfiction so amazing that it got multiple people to actually change their lives in an effort at being more rational. (...hm'kay, perhaps that's not quite awesome enough to be on this list... but you've got to admit that it's in the neighbourhood.)
When Eliezer Yudkowsky does the incredulous stare, it becomes a valid argument.
ph'nglui mglw'nafh Eliezer Yudkowsky Clinton Township wgah'nagl fhtagn
Doesn't really roll off the tongue, does it.
(http://en.wikipedia.org/wiki/Cryonics_Institute)
Eliezer Yudkowsky knows exactly how best to respond to this thread; he's just left it as homework for us.
Unlike Frodo, Eliezer Yudkowsky had no trouble throwing the Ring into the fires of Mount Foom.
Eliezer Yudkowsky's keyboard only has two keys: 1 and 0.
The speed of light used to be much lower before Eliezer Yudkowsky optimized the laws of physics.
Eliezer Yudkowsky can escape an AI box while wearing a straitjacket and submerged in a shark tank.
Ooh, this is fun.
Robert Aumann has proven that ideal Bayesians cannot disagree with Eliezer Yudkowsky.
Eliezer Yudkowsky can make AIs Friendly by glaring at them.
Angering Eliezer Yudkowsky is a global existential risk.
Eliezer Yudkowsky thought he was wrong one time, but he was mistaken.
Eliezer Yudkowsky predicts Omega's actions with 100% accuracy.
An AI programmed to maximize utility will tile the Universe with tiny copies of Eliezer Yudkowsky.
Where's the punch line?
Eliezer can in fact tile the Universe with himself, simply by slicing himself into finitely many pieces. The only reason the rest of us are here is quantum immortality.
... because all of them are Eliezer Yudkowsky.
They call it "spontaneous symmetry breaking", because Eliezer Yudkowsky just felt like breaking something one day.
Particles in parallel universes interfere with each other all the time, but nobody interferes with Eliezer Yudkowsky.
An oracle for the Halting Problem is Eliezer Yudkowsky's cellphone number.
When tachyons get confused about their priors and posteriors, they ask Eliezer Yudkowsky for help.
And the first action of any Friendly AI will be to create a nonprofit institute to develop a rigorous theory of Eliezer Yudkowsky. Unfortunately, it will turn out to be an intractable problem.
Transhuman AIs theorize that if they could create Eliezer Yudkowsky, it would lead to an "intelligence explosion".
Eliezer Yudkowsky took his glasses off once. Now he calls it the certainty principle.
Eliezer Yudkowsky can make Chuck Norris shave his beard off by using text-only communication.
(stolen from here)
Now I'm too curious whether this would actually be true. Would the two of them test this if I paid them $50 each (plus an extra $10 for the winner)?
$50 won't even get you in to talk to Norris. (Wouldn't do it even at his old charity martial arts things.) Maybe not Eliezer either. Norris is kept pretty darn busy in part due to his memetic status.
On the other hand, EY might accept because if he won such a bet, it would bring tremendous visibility to him, SIAI, and uFAI-related concepts among the wider public.
Well, I'd increase those figures by a few orders of magnitude ... if I had a few orders of magnitudes more money than I do now. :-)
• Eliezer Yudkowsky uses blank territories for drafts.
• Just before this universe runs out of negentropy, Eliezer Yudkowsky will persuade the Dark Lords of the Matrix to let him out of the universe.
• Eliezer Yudkowsky signed up for cryonics to be revived when technologies are able to make him an immortal alicorn princess.
• Eliezer Yudkowsky's MBTI type is TTTT.
• Eliezer Yudkowsky's punch is the only way to kill a quantum immortal person, because he is guaranteed to punch him in all Everett branches.
• "Turns into an Eliezer Yudkowsky fact when preceded by its quotation" turns into an Eliezer Yudkowsky fact when preceded by its quotation.
• Lesser minds cause wavefunction collapse. Eliezer Yudkowsky's mind prevents it.
• Planet Earth is originally a mechanism designed by aliens to produce Eliezer Yudkowsky from sunlight.
• The real world doesn't make sense. This world is just Eliezer Yudkowsky's fanfic of it, with Eliezer Yudkowsky as a self-insert.
• When Eliezer Yudkowsky takes nootropics, the universe starts to lag from the lack of processing power.
• Eliezer Yudkowsky can kick your ass in an uncountably infinite number of counterfactual universes simultaneously.
Love it.
This one seems to be true. True of Eliezer Yudkowsky and true of every other human living or dead (again simultaneously). "Uncountably infinite counterfactual universes" make most mathematically coherent tasks kind of trivial. This is actually a less impressive feat than, say, "Chuck Norris contains at least one water molecule".
I start noticing a pattern in my life. When I tell several jokes at once, people are most amused with the one I think is the least funny.
That was not what I was thinking about. I should have been. Kinda obvious in hindsight.
Eliezer Yudkowsky updates reality to fit his priors.
Eliezer Yudkowsky can consistently assert the sentence "Eliezer Yudkowsky cannot consistently assert this sentence."
Everything is reducible -- to Eliezer Yudkowsky.
Scientists only wear lab coats because Eliezer Yudkowsky has yet to be seen wearing a clown suit.
Algorithms want to know how Eliezer Yudkowsky feels from the inside.
P-zombies gain qualia after being in the presence of Eliezer Yudkowsky.
Teachers try to guess Eliezer Yudkowsky's password.
Eliezer Yudkowsky's map is more accurate than the territory.
Eliezer Yudkowsky's map IS the territory.
I'd prefer "Eliezer Yudkowsky can fold up the territory and put it in his pocket."
Mmhmm... Borges time!
One time Eliezer Yudkowsky got into a debate with the universe about whose map best corresponded to territory. He told the universe he'd meet it outside and they could settle the argument once and for all.
He's still waiting.
After Eliezer Yudkowsky was conceived, he recursively self-improved to personhood in mere weeks and then talked his way out of the womb.
Eliezer Yudkowsky did the impossible for practice.
Eliezer Yudkowsky mines bitcoins in his head.
Reversed stupidity is not Eliezer Yudkowsky.
(Photoshopped version of this photo.)
This is amazing.
I for one think you should turn it into a post. Brilliant artwork should be rewarded, and not everyone will see it here.
(May be a stupid idea, but figured I'd raise the possibility.)
Note for the clueless (i.e. RationalWiki): This is photoshopped. It is not an actual slide from any talk I have given.
Eliezer you just spoiled half the fun :)
I must ask: where did you see someone actually taking it seriously? As opposed to thinking that the EY Facts thing was a bad idea even as local humour. (There was one poster on Talk:Eliezer Yudkowsky who was appalled that you would let the EY Facts post onto your site; I must confess his thinking was not quite clear to me - I can't see how not just letting the post find its level in the karma system, as happened, would be in any way a good idea - but I did proceed to write a similar list about Trent Toulouse.)
Edit: Ah, found it. That was the same Tetronian who posts here, and has gone to some effort to lure RWians here. I presume he meant the original of the picture, not the joke version. I'm sure he'll be along in a moment to explain himself.
My reaction was pointed in the same direction as that poster's, though not as extreme. It seems indecent to have something like this associated with you directly. It lends credence to insinuations of personality cult and oversized ego. I mean, compare it to Chuck Norris's response ("in response to").
If someone posted something like this about me on a site of mine and I became aware of it, I would say "very funny, but it's going down in a day. Save any you think are clever and take it to another site."
I'm actually quite surprised there isn't a Wikimedia Meta-Wiki page of Jimmy Wales Facts. Perhaps the current fundraiser (where we squeeze his celebrity status for every penny we can - that's his volunteer job now, public relations) will inspire some.
Edit: I couldn't resist.
I'm a bit late to the party, I see. It was an honest mistake; no harm done, I hope.
Edit: on the plus side, I noticed I've been called "clueless" by Eliezer. Pretty amusing.
Edit2: Yes, David is correct.
RationalWiki is you? Nice. I like the lesswrong page there. Brilliant!
Sorry if I've contributed to reinforcing anyone's weird stereotypes of you. I thought it would be obvious to anybody that the picture was a joke.
Edit: For what it's worth, I moved the link to the original image to the top of the post, and made it explicit that it's photoshopped.
You mean some of the comments in the Eliezer Yudkowsky Facts thread are not literal depictions of reality? How dare you!
Pinker (How the Mind Works, 1997) says "The difference between Einstein and a high school dropout is trivial... or between the high school dropout and a chimpanzee..."
Eliezer is not a high school dropout and I am an advocate of unschooling, but the difference in the quotes is interesting.
The link isn't to the book you're talking about.
This would result in a light-speed wave of unnormalized Eliezer Yudkowsky. The only solution is if there is in fact only one universe, and that universe is the one observed by Eliezer Yudkowsky.
Eliezer Yudkowsky will never have a mid-life crisis.
That took me a second. Cute.
I don't get it =|
He'll live forever, and the middle of forever doesn't happen.
And you say he's the cute one xD
Eliezer Yudkowsky heard about Voltaire's claim that "If God did not exist, it would be necessary to invent Him," and started thinking about what programming language to use.
Eliezer Yudkowsky made a mistake once - but only so he could calibrate his confidence level.
Eliezer Yudkowsky can fit an entire bestselling book into a single tumblr post.
Absence of 10^26 paperclips is evidence of Eliezer Yudkowsky.
(From an actual Cards against Rationality game we played)
If he finds any, they convince him to provide them with plentiful hardware and bandwidth.
Most people take melatonin 30 minutes before bedtime; Eliezer Yudkowsky takes melatonin 6 hours before - it just takes the melatonin that long to subdue his endocrine system.
I don't always think clearly, but when I do, I prefer Yudkowsky.
Stay friendly, AIs.
By quoting others, no less...
Eliezer Yudkowsky is worth more than one paperclip.
...even to a paper clip maximizer
True Talmudic story, from TVTropes. Scarily prescient? Also: related musings from Muflax' blog.
Original: http://www.senderberl.com/jewish/trial.htm
That link's down, but here's a live one.
That appears to be a malware site. Is it the same as http://web.ics.purdue.edu/~marinaj/babyloni.htm ?
Yep.
And while we're trading Yeshiva stories...
Rabbi Elazar Ben Azariah was a renowned leader and scholar, who was elected Nassi (leader) of the Jewish people at the age of eighteen. The Sages feared that, as such a young man, he would not be respected. Overnight, his hair turned grey and his beard grew so that he looked as if he were 70 years old.
http://www.torahtots.com/holidays/pesach/pesseder.htm
Eliezer Yudkowsky two-boxes on the Monty Hall problem.
Everyone knows he six-boxes (many worlds interpretation, choosing 3 boxes then switching and not switching).
Technically, that would be eight-boxing. (Or 24 if you let the prize be in any box). I'll explain:
Let's say the prize is in box A. So the eight options are: pick A, host opens B, stay; pick A, host opens B, switch; pick A, host opens C, stay; pick A, host opens C, switch; pick B, host opens C, stay; pick B, host opens C, switch; pick C, host opens B, stay; pick C, host opens B, switch.
By symmetry, there are eight options for whichever box it is in, so there are 24 possibilities if you include everything.
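For anyone who'd rather check the underlying Monty Hall arithmetic empirically than enumerate cases, here's a minimal simulation sketch (plain Python; the function names are mine, nothing from the thread is assumed). Switching should win about 2/3 of the time:

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One round of Monty Hall; returns True if the player wins the prize."""
    prize = random.randrange(3)
    pick = random.randrange(3)
    # The host opens the lowest-numbered door that is neither the pick nor
    # the prize. (Which of the two eligible doors he opens when pick == prize
    # doesn't affect the win probabilities.)
    opened = next(d for d in range(3) if d != pick and d != prize)
    if switch:
        # Switch to the remaining unopened door.
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == prize

trials = 100_000
switch_wins = sum(monty_hall_trial(True) for _ in range(trials)) / trials
stay_wins = sum(monty_hall_trial(False) for _ in range(trials)) / trials
print(f"switch: {switch_wins:.3f}, stay: {stay_wins:.3f}")  # roughly 0.667 vs 0.333
```

This matches the case count above: of the eight equally-structured branches, switching wins exactly when the initial pick was wrong, which happens 2/3 of the time.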
Eliezer Yudkowsky two-boxes on the Iterated Prisoner's Dilemma.