
Eliezer Yudkowsky Facts

Post author: steven0461 22 March 2009 08:17PM 114 points
  • Eliezer Yudkowsky was once attacked by a Moebius strip. He beat it to death with the other side.
  • Inside Eliezer Yudkowsky's pineal gland is not an immortal soul, but another brain.
  • Eliezer Yudkowsky's favorite food is printouts of Rice's theorem.
  • Eliezer Yudkowsky's favorite fighting technique is a roundhouse dustspeck to the face.
  • Eliezer Yudkowsky once brought peace to the Middle East from inside a freight container, through a straw.
  • Eliezer Yudkowsky once held up a sheet of paper and said, "A blank map does not correspond to a blank territory". It was thus that the universe was created.
  • If you dial Chaitin's Omega, you get Eliezer Yudkowsky on the phone.
  • Unless otherwise specified, Eliezer Yudkowsky knows everything that he isn't telling you.
  • Somewhere deep in the microtubules inside an out-of-the-way neuron somewhere in the basal ganglia of Eliezer Yudkowsky's brain, there is a little XML tag that says awesome.
  • Eliezer Yudkowsky is the Muhammad Ali of one-boxing.
  • Eliezer Yudkowsky is a 1400 year old avatar of the Aztec god Aixitl.
  • The game of "Go" was abbreviated from "Go Home, For You Cannot Defeat Eliezer Yudkowsky".
  • When Eliezer Yudkowsky gets bored, he pinches his mouth shut at the 1/3 and 2/3 points and pretends to be a General Systems Vehicle holding a conversation among itselves. On several occasions he has managed to fool bystanders.
  • Eliezer Yudkowsky has a swiss army knife that has folded into it a corkscrew, a pair of scissors, an instance of AIXI which Eliezer once beat at tic tac toe, an identical swiss army knife, and Douglas Hofstadter.
  • If I am ignorant about a phenomenon, that is not a fact about the phenomenon; it just means I am not Eliezer Yudkowsky.
  • Eliezer Yudkowsky has no need for induction or deduction. He has perfected the undiluted master art of duction.
  • There was no ice age. Eliezer Yudkowsky just persuaded the planet to sign up for cryonics.
  • There is no spacetime symmetry. Eliezer Yudkowsky just sometimes holds the territory upside down, and he doesn't care.
  • Eliezer Yudkowsky has no need for doctors. He has implemented a Universal Curing Machine in a system made out of five marbles, three pieces of plastic, and some of MacGyver's fingernail clippings.
  • Before Bruce Schneier goes to sleep, he scans his computer for uploaded copies of Eliezer Yudkowsky.

If you know more Eliezer Yudkowsky facts, post them in the comments.

Comments (285)

Comment author: Spurlock 27 February 2012 04:50:51PM 32 points [-]

Eliezer Yudkowsky two-boxes on the Monty Hall problem.

Comment author: fubarobfusco 28 February 2012 02:49:55AM 11 points [-]

Eliezer Yudkowsky two-boxes on the Iterated Prisoner's Dilemma.

Comment author: Dmytry 27 February 2012 04:53:02PM *  3 points [-]

Everyone knows he six-boxes (many worlds interpretation, choosing 3 boxes then switching and not switching).

Comment author: Multiheaded 15 June 2012 09:04:02AM *  31 points [-]

Rabbi Eliezer was in an argument with five fellow rabbis over the proper way to perform a certain ritual. The other five rabbis were all in agreement with each other, but Rabbi Eliezer vehemently disagreed. Finally, Rabbi Nathan pointed out, "Eliezer, the vote is five to one! Give it up already!" Eliezer got fed up and said, "If I am right, may God himself tell you so!" Thunder crashed, the heavens opened up, and the voice of God boomed down. "YES," said God, "RABBI ELIEZER IS RIGHT. RABBI ELIEZER IS PRETTY MUCH ALWAYS RIGHT." Rabbi Nathan turned and conferred with the other rabbis for a moment, then turned back to Rabbi Eliezer. "All right, Eliezer," he said, "the vote stands at five to TWO."

True Talmudic story, from TVTropes. Scarily prescient? Also: related musings from Muflax's blog.

Comment author: Eliezer_Yudkowsky 15 June 2012 10:52:16PM 8 points [-]
Comment author: Dr_Manhattan 15 June 2012 11:19:05PM 5 points [-]

And while we're trading Yeshiva stories...

Rabbi Elazar Ben Azariah was a renowned leader and scholar, who was elected Nassi (leader) of the Jewish people at the age of eighteen. The Sages feared that as such a young man, he would not be respected. Overnight, his hair turned grey and his beard grew so he looked as if he were 70 years old.

http://www.torahtots.com/holidays/pesach/pesseder.htm

Comment author: JoshuaFox 11 January 2013 09:57:10AM 4 points [-]

That link's down, but here's a live one.

Comment author: ErikM 10 September 2012 06:35:20AM 4 points [-]

That appears to be a malware site. Is it the same as http://web.ics.purdue.edu/~marinaj/babyloni.htm ?

Comment author: Eliezer_Yudkowsky 10 September 2012 10:45:17AM 3 points [-]

Yep.

Comment author: Emile 29 January 2013 01:45:00PM *  24 points [-]
Comment author: Andreas_Giger 29 January 2013 02:08:59PM 2 points [-]

By quoting others, no less...

Comment author: gwern 18 January 2014 01:23:41AM 19 points [-]

Most people take melatonin 30 minutes before bedtime; Eliezer Yudkowsky takes melatonin 6 hours before - it just takes the melatonin that long to subdue his endocrine system.

Comment author: PhilGoetz 23 March 2009 02:19:01AM *  85 points [-]

Eliezer Yudkowsky made a mistake once - but only so he could calibrate his confidence level.

Comment author: PhilGoetz 23 March 2009 12:04:52PM *  121 points [-]
  • If you put Eliezer Yudkowsky in a box, the rest of the universe is in a state of quantum superposition until you open it again.
  • Eliezer Yudkowsky can prove it's not butter.
  • If you say Eliezer Yudkowsky's name 3 times out loud, it prevents anything magical from happening.
Comment author: marchdown 27 November 2010 01:46:12AM 65 points [-]

This last one actually works!

Comment author: Vivi 15 September 2011 10:56:44PM 1 point [-]

Wouldn't that be a case of belief in belief though?

Comment author: Giles 08 June 2011 02:45:13AM 69 points [-]

Eliezer Yudkowsky will never have a mid-life crisis.

Comment author: Alicorn 08 June 2011 03:01:35AM 17 points [-]

That took me a second. Cute.

Comment author: Gust 12 December 2011 03:09:52AM 3 points [-]

I don't get it =|

Comment author: Alicorn 12 December 2011 03:23:37AM 22 points [-]

He'll live forever, and the middle of forever doesn't happen.

Comment author: ata 10 March 2010 05:44:43PM *  100 points [-]

(Photoshopped version of this photo.)

The scale of intelligent minds

Comment author: Eliezer_Yudkowsky 24 November 2010 02:19:49AM 49 points [-]

Note for the clueless (i.e. RationalWiki): This is photoshopped. It is not an actual slide from any talk I have given.

Comment author: XiXiDu 24 November 2010 12:21:02PM *  25 points [-]

Note for the clueless (i.e. RationalWiki): This is photoshopped. It is not an actual slide from any talk I have given.

Here is a real photo if you need one ;-)

Comment author: TheOtherDave 02 December 2010 08:09:17PM 15 points [-]

Note for the clueless (i.e. RationalWiki):

I've been trying to decide for a while now whether I believe you meant "e.g." I'm still not sure.

Comment author: Eliezer_Yudkowsky 03 December 2010 02:16:44AM 20 points [-]

RationalWiki was the only place I saw this mistake made, so the i.e. seemed deserved to me.

Comment author: XiXiDu 03 December 2010 12:37:11PM *  27 points [-]

It looks like it has turned awful since I last read it:

This essay, while entertaining and useful, can be seen as Yudkowsky trying to reinvent the sense of awe associated with religious experience in the name of rationalism. It's even available in tract format.

The most serious mistake of the entry in its current form seems to be that it lumps together all of Less Wrong and thereby stereotypes its members. So far this still seems to be a community blog with differing opinions. I have a Karma score of over 1700, and I have been criticizing the SIAI and Yudkowsky (in a fairly poor way).

I hope you people are reading this. I don't see why you draw a line between yourselves and Less Wrong. This place is not an invite-only party.

LessWrong is dominated by Eliezer Yudkowsky, a research fellow for the Singularity Institute for Artificial Intelligence.

I don't think this is the case anymore. You can easily get Karma by criticizing him and the SIAI. Most new posts are no longer written by him either.

Members of the Less Wrong community are expected to be on board with the singularitarian/transhumanist/cryonics bundle.

Nah!

If you indicate your disagreement with the local belief clusters without at least using their jargon, someone may helpfully suggest that "you should try reading the sequences" before you attempt to talk to them.

I don't think that's asking too much. As the FAQ states:

Why do you all agree on so much? Am I joining a cult?

We have a general community policy of not pretending to be open-minded on long-settled issues for the sake of not offending people. If we spent our time debating the basics, we would never get to the advanced stuff at all.

It's unclear whether Descartes, Spinoza or Leibniz would have lasted a day without being voted down into oblivion.

So? I don't see what this is supposed to prove.

Indeed, if anyone even hints at trying to claim to be a "rationalist" but doesn't write exactly what is expected, they're likely to be treated with contempt.

Provide some references here.

Some members of this "rationalist" movement literally believe in what amounts to a Hell that they will go to if they get artificial intelligence wrong in a particularly disastrous way.

I've been criticizing the subject matter and got upvoted for it, as you obviously know, since you linked to my comments as reference. Further, I never claimed that the topic is unproblematic or irrational, but that I feared unreasonable consequences and that I disagreed with how the content was handled. Yet I do not agree with your portrayal, insofar as it is not something that fits a wiki entry about Less Wrong. Something is not wrong merely because it sounds extreme and absurd. In theory there is nothing that makes the subject matter fallacious.

Yudkowsky has declared the many worlds interpretation of quantum physics is correct, despite the lack of testable predictions differing from the Copenhagen interpretation, and despite admittedly not being a physicist.

I haven't read the quantum physics sequence but by what I have glimpsed this is not the crucial point that distinguishes MWI from other interpretations. That's why people suggest one should read the material before criticizing it.

P.S. I'm curious whether you know of a more intelligent and rational community than Less Wrong. I don't! Proclaiming that Less Wrong is more rational than most other communities isn't necessarily factually wrong.

Edit: "[...] by what I have glimpsed this is just wrong." now reads "[...] by what I have glimpsed this is not the crucial point that distinguishes MWI from other interpretations."

Comment author: Jack 21 December 2010 08:52:09PM 24 points [-]

Yudkowsky has declared the many worlds interpretation of quantum physics is correct, despite the lack of testable predictions differing from the Copenhagen interpretation, and despite admittedly not being a physicist.

I think there is a fair chance the many-worlds interpretation is wrong, but anyone who criticizes it by defending the Copenhagen 'interpretation' has no idea what they're talking about.

Comment author: ArisKatsaris 03 December 2010 01:14:14PM 14 points [-]

I haven't read the quantum physics sequence but by what I have glimpsed this is just wrong. That's why people suggest one should read the material before criticizing it.

Irony.

Xixidu, you should also read the material before trying to defend it.

Comment author: wedrifid 03 December 2010 01:50:05PM 19 points [-]

It's unclear whether Descartes, Spinoza or Leibniz would have lasted a day without being voted down into oblivion.

So? I don't see what this is supposed to prove.

I know, I loved that quote. I just couldn't work out why it was presented as a bad thing.

Comment author: Jack 21 December 2010 08:55:46PM 12 points [-]

Descartes is maybe the single best example of motivated cognition in the history of Western thought. Though interestingly, there are some theories that he was secretly an atheist.

I assume their point has something to do with those three being rationalists in the traditional sense... but I don't think Rational Wiki is using the word in the traditional sense either. Would Descartes have been allowed to edit an entry on souls?

Comment author: PhilGoetz 27 September 2011 03:30:01AM 2 points [-]

You think the average person on LessWrong ranks with Spinoza and Leibniz? I disagree.

Comment author: JoshuaZ 27 September 2011 03:33:25AM *  15 points [-]

Do you mean Spinoza or Leibniz given their knowledge base and upbringing, or the same person with a modern environment? I know everything Leibniz knew and a lot more besides. But I suspect that if the same individual grew up in a modern family environment similar to my own, he would have accomplished a lot more than I have at the same age.

Comment author: Jack 27 September 2011 04:00:13AM 2 points [-]

the same person with a modern environment

They wouldn't be the same person. Which is to say, the whole matter is nonsense, as the other replies in this thread made clear.

Comment author: JoshuaZ 27 September 2011 04:09:29AM 6 points [-]

Sorry, I thought the notion was clear that one would be talking about the same genetics but a different environment. Illusion of transparency and all that. Explicit formulation: if one took a fertilized egg with Leibniz's genetic material and raised it in an American middle-class family with a high emphasis on intellectual success, I'm pretty sure he would, by the time he got to my age, have accomplished more than I have. Does that make the meaning clear?

Comment author: wedrifid 27 September 2011 04:30:34AM *  10 points [-]

You think the average person on LessWrong ranks with Spinoza and Leibniz? I disagree.

Wedrifid_2010 was not assigning a status ranking or even an evaluation of overall intellectual merit or potential. For that matter predicting expected voting patterns is a far different thing than assigning a ranking. People with excessive confidence in habitual thinking patterns that are wrong or obsolete will be downvoted into oblivion where the average person is not, even if the former is more intelligent or more intellectually impressive overall.

I also have little doubt that any of those three would be capable of recovering from their initial day or three of spiraling downvotes, assuming they were willing to ignore their egos, do some heavy reading of the sequences, and generally spend some time catching up on modern thought. But for as long as those individuals were writing material similar to that which identifies them, they would be downvoted by lesswrong_2010. Possibly even by lesswrong_now too.

Comment author: Normal_Anomaly 21 December 2010 08:37:26PM 1 point [-]

Yes. Upvotes come from original, insightful contributions. Descartes', Spinoza's, and Leibniz's ideas are hundreds of years old and dated.

Comment author: RobinZ 22 December 2010 01:31:51AM 5 points [-]

Not exactly the point - I think the claim is that they would be downvoted even if they were providing modern, original content ... which I would question, even then. We've had quite successful theist posters before, for example.

Comment author: Jack 22 December 2010 01:41:19AM 6 points [-]

I think the claim is that they would be downvoted even if they were providing modern, original content

What would this even mean? Like, if they were transported forward in time and formed new beliefs on the basis of modern science? If they were cloned from DNA surviving in their bone marrow and then adopted by modern, secular families, took AP Calculus and learned to program?

What a goofy thing to even be talking about.

Comment author: wedrifid 22 December 2010 02:12:11AM 5 points [-]

Not exactly the point - I think the claim is that they would be downvoted even if they were providing modern, original content ... which I would question, even then.

I would downvote Descartes based on the quality of his thinking and argument even if it was modern bad thinking. At least I would if he persisted with the line after the first time or two he was corrected. I suppose this is roughly equivalent to what you are saying.

Comment author: David_Gerard 24 November 2010 02:43:53PM *  5 points [-]

I must ask: where did you see someone actually taking it seriously? As opposed to thinking that the EY Facts thing was a bad idea even as local humour. (There was one poster on Talk:Eliezer Yudkowsky who was appalled that you would let the EY Facts post onto your site; I must confess his thinking was not quite clear to me - I can't see how anything other than just letting the post find its level in the karma system, as happened, would be in any way a good idea - but I did proceed to write a similar list about Trent Toulouse.)

Edit: Ah, found it. That was the same Tetronian who posts here, and has gone to some effort to lure RWians here. I presume he meant the original of the picture, not the joke version. I'm sure he'll be along in a moment to explain himself.

Comment author: ata 24 November 2010 05:11:54PM 6 points [-]

I presume he meant the original of the picture, not the joke version.

"having watched the speech that the second picture is from, I can attest that he meant it as a joke" does sound like he's misremembering the speech as having actually included that.

Comment author: [deleted] 25 July 2011 12:21:15AM *  2 points [-]

I'm a bit late to the party, I see. It was an honest mistake; no harm done, I hope.

Edit: on the plus side, I noticed I've been called "clueless" by Eliezer. Pretty amusing.

Edit2: Yes, David is correct.

Comment author: wedrifid 25 July 2011 02:02:03AM 3 points [-]

Edit: on the plus side, I noticed I've been called "clueless" by Eliezer. Pretty amusing.

RationalWiki is you? Nice. I like the lesswrong page there. Brilliant!

Comment author: [deleted] 25 July 2011 02:53:35AM *  5 points [-]

I started the article way back in May of 2010, at which point I viewed LW as weird and unsettling rather than awesome. As you can see, though, David_Gerard and others have made the article significantly better since then.

Comment author: Vaniver 24 November 2010 08:26:50PM 2 points [-]

I must confess his thinking was not quite clear to me - I can't see how not just letting the post find its level in the karma system, as happened, would be in any way a good idea

My reaction was pointed in the same direction as that poster's, though not as extreme. It seems indecent to have something like this associated with you directly. It lends credence to insinuations of personality cult and oversized ego. I mean, compare it to Chuck Norris's response.

If someone posted something like this about me on a site of mine and I became aware of it, I would say "very funny, but it's going down in a day. Save any you think are clever and take it to another site."

Comment author: David_Gerard 24 November 2010 09:29:57PM *  8 points [-]

I'm actually quite surprised there isn't a Wikimedia Meta-Wiki page of Jimmy Wales Facts. Perhaps the current fundraiser (where we squeeze his celebrity status for every penny we can - that's his volunteer job now, public relations) will inspire some.

Edit: I couldn't resist.

Comment author: steven0461 24 November 2010 11:10:02PM 2 points [-]

Would it help if I added a disclaimer to the effect that "this was an attempt at mindless nerd amusement, not worship or mockery"? If there's a general sense that people are taking the post the wrong way and it's hurting reputations, I'm happy to take it down entirely.

Comment author: David_Gerard 24 November 2010 11:16:42PM 7 points [-]

I really wouldn't bother. Anyone who doesn't like these things won't be mollified.

Comment author: ata 24 November 2010 05:22:23PM *  9 points [-]

Sorry if I've contributed to reinforcing anyone's weird stereotypes of you. I thought it would be obvious to anybody that the picture was a joke.

Edit: For what it's worth, I moved the link to the original image to the top of the post, and made it explicit that it's photoshopped.

Comment author: XiXiDu 24 November 2010 06:52:28PM *  9 points [-]

No sane person would proclaim something like that. Even someone who does not know the context and does not know who Eliezer Yudkowsky is should conclude that it is reasonable to assume the slide was not meant to be taken seriously (e.g. is a joke).

Extremely exaggerated manipulations are, in my opinion, not deception, just fun.

Comment author: wedrifid 03 December 2010 03:20:06PM 14 points [-]

You mean some of the comments in the Eliezer Yudkowsky Facts thread are not literal depictions of reality? How dare you!

Comment author: ata 03 December 2010 05:36:07PM 6 points [-]

Yep, it turns out that Eliezer is not literally the smartest, most powerful, most compassionate being in the universe. A bit of a letdown, isn't it? I know a lot of people expected better of him.

Comment author: Risto_Saarelma 24 November 2010 06:36:01PM 3 points [-]

That might be underestimating the power of lack of context.

Comment author: roland 02 November 2012 10:25:23PM 1 point [-]

Eliezer, you just spoiled half the fun :)

Comment author: JoshuaFox 11 March 2012 01:38:32PM *  3 points [-]

Pinker (How the Mind Works, 1997) says: "The difference between Einstein and a high school dropout is trivial... or between the high school dropout and a chimpanzee..."

Eliezer is not a high school dropout and I am an advocate of unschooling, but the difference in the quotes is interesting.

Comment author: Will_Newsome 10 March 2010 11:36:59PM 2 points [-]

This is amazing.

I for one think you should turn it into a post. Brilliant artwork should be rewarded, and not everyone will see it here.

(May be a stupid idea, but figured I'd raise the possibility.)

Comment author: LucasSloan 11 March 2010 01:25:42AM 18 points [-]

It's good, but we should reserve top-level posts for things that are truly important.

Comment author: ata 11 March 2010 01:50:19AM *  12 points [-]

Thanks! Glad people like it, but I'll have to agree with Lucas — I prefer top-level posts to be on-topic, in-depth, and interesting (or at least two of those), and as I expect others feel the same way, I don't want a more worthy post to be pushed off the bottom of the list for the sake of a funny picture.

Comment author: ChrisHallquist 20 January 2013 07:11:44PM 30 points [-]

Eliezer Yudkowsky heard about Voltaire's claim that "If God did not exist, it would be necessary to invent Him," and started thinking about what programming language to use.

Comment author: Mqrius 09 January 2013 02:23:55AM 10 points [-]

Eliezer Yudkowsky is worth more than one paperclip.

Comment author: shminux 09 January 2013 02:40:25AM *  14 points [-]

...even to a paper clip maximizer

Comment author: Peter_de_Blanc 25 March 2009 03:10:32AM 44 points [-]

Reversed stupidity is not Eliezer Yudkowsky.

Comment author: Cyan 24 March 2009 03:15:36AM *  28 points [-]
  • A mixture of two parts Red Bull to one part Eliezer Yudkowsky creates a universal question solvent.
  • Eliezer Yudkowsky experiences all paths through configuration space because he only constructively interferes with himself.
  • Eliezer Yudkowsky's mental states are not ontologically fundamental, but only because he chooses so of his own free will.
Comment author: Manfred 25 July 2011 06:22:55AM 7 points [-]

Eliezer Yudkowsky experiences all paths through configuration space because he only constructively interferes with himself.

This would result in a light-speed wave of unnormalized Eliezer Yudkowsky. The only solution is if there is in fact only one universe, and that universe is the one observed by Eliezer Yudkowsky.

Comment author: ata 30 October 2011 04:41:27PM *  64 points [-]

After Eliezer Yudkowsky was conceived, he recursively self-improved to personhood in mere weeks and then talked his way out of the womb.

Comment author: Wei_Dai 24 July 2009 02:55:20AM *  50 points [-]
  • After the truth destroyed everything it could, the only thing left was Eliezer Yudkowsky.
  • In his free time, Eliezer Yudkowsky likes to help the Halting Oracle answer especially difficult queries.
  • Eliezer Yudkowsky actually happens to be the pinnacle of Intelligent Design. He only claims to be the product of evolution to remain approachable to the rest of us.
  • Omega did its Ph.D. thesis on Eliezer Yudkowsky. Needless to say, it's too long to be published in this world. Omega is now doing post-doctoral research, tentatively titled "Causality vs. Eliezer Yudkowsky - An Indistinguishability Argument".
Comment author: Wei_Dai 24 July 2009 11:02:03PM 68 points [-]
  • It was easier for Eliezer Yudkowsky to reformulate decision theory to exclude time than to buy a new watch.
  • Eliezer Yudkowsky's favorite sport is black hole diving. His information density is so great that no black hole can absorb him, so he just bounces right off the event horizon.
  • God desperately wants to believe that when Eliezer Yudkowsky says "God doesn't exist," it's just good-natured teasing.
  • Never go in against Eliezer Yudkowsky when anything is on the line.
Comment author: orthonormal 31 July 2010 06:29:56PM 48 points [-]

Eliezer Yudkowsky can consistently assert the sentence "Eliezer Yudkowsky cannot consistently assert this sentence."

Comment author: badger 22 March 2009 09:37:02PM *  45 points [-]

Everything is reducible -- to Eliezer Yudkowsky.

Scientists only wear lab coats because Eliezer Yudkowsky has yet to be seen wearing a clown suit.

Algorithms want to know how Eliezer Yudkowsky feels from the inside.

Comment author: badger 22 March 2009 09:41:53PM 42 points [-]

Teachers try to guess Eliezer Yudkowsky's password.

Comment author: Yvain 22 March 2009 09:53:13PM *  84 points [-]

Eliezer Yudkowsky's map is more accurate than the territory.

Comment author: SilasBarta 16 June 2009 07:31:59PM 17 points [-]

One time Eliezer Yudkowsky got into a debate with the universe about whose map best corresponded to territory. He told the universe he'd meet it outside and they could settle the argument once and for all.

He's still waiting.

Comment author: Lightwave 24 July 2009 02:36:22PM 9 points [-]

Eliezer Yudkowsky's map IS the territory.

Comment author: khafra 16 April 2010 11:52:44AM 44 points [-]

I'd prefer "Eliezer Yudkowsky can fold up the territory and put it in his pocket."

Comment author: Lambda 04 February 2012 09:44:51AM 6 points [-]

Mmhmm... Borges time!

In that Empire, the Art of Cartography attained such Perfection that the map of a single Province occupied the entirety of a City, and the map of the Empire, the entirety of a Province. In time, those Unconscionable Maps no longer satisfied, and the Cartographers Guilds struck a Map of the Empire whose size was that of the Empire, and which coincided point for point with it. The following Generations, who were not so fond of the Study of Cartography as their Forebears had been, saw that that vast Map was Useless, and not without some Pitilessness was it, that they delivered it up to the Inclemencies of Sun and Winters. In the Deserts of the West, still today, there are Tattered Ruins of that Map, inhabited by Animals and Beggars; in all the Land there is no other Relic of the Disciplines of Geography.

—Jorge Luis Borges, "On Exactitude in Science"

Comment author: badger 22 March 2009 09:53:31PM 14 points [-]

P-zombies gain qualia after being in the presence of Eliezer Yudkowsky.

Comment author: Yvain 22 March 2009 09:07:07PM *  91 points [-]

Ooh, this is fun.

Robert Aumann has proven that ideal Bayesians cannot disagree with Eliezer Yudkowsky.
Eliezer Yudkowsky can make AIs Friendly by glaring at them.
Angering Eliezer Yudkowsky is a global existential risk.
Eliezer Yudkowsky thought he was wrong one time, but he was mistaken.
Eliezer Yudkowsky predicts Omega's actions with 100% accuracy.
An AI programmed to maximize utility will tile the Universe with tiny copies of Eliezer Yudkowsky.

Comment author: SoullessAutomaton 23 March 2009 02:30:29AM 43 points [-]

Eliezer Yudkowsky can make AIs Friendly by glaring at them.

And the first action of any Friendly AI will be to create a nonprofit institute to develop a rigorous theory of Eliezer Yudkowsky. Unfortunately, it will turn out to be an intractable problem.

Comment author: Yvain 23 March 2009 10:42:15AM 47 points [-]

Transhuman AIs theorize that if they could create Eliezer Yudkowsky, it would lead to an "intelligence explosion".

Comment author: Anatoly_Vorobey 22 March 2009 09:28:23PM 30 points [-]

Robert Aumann has proven that ideal Bayesians cannot disagree with Eliezer Yudkowsky.

... because all of them are Eliezer Yudkowsky.

They call it "spontaneous symmetry breaking", because Eliezer Yudkowsky just felt like breaking something one day.

Particles in parallel universes interfere with each other all the time, but nobody interferes with Eliezer Yudkowsky.

An oracle for the Halting Problem is Eliezer Yudkowsky's cellphone number.

When tachyons get confused about their priors and posteriors, they ask Eliezer Yudkowsky for help.

Comment author: dclayh 25 March 2009 04:10:26AM 9 points [-]

Eliezer can in fact tile the Universe with himself, simply by slicing himself into finitely many pieces. The only reason the rest of us are here is quantum immortality.

Comment author: Liron 23 March 2009 09:10:08AM 11 points [-]

Angering Eliezer Yudkowsky is a global existential risk

Where's the punch line?

Comment author: dspeyer 06 June 2014 06:30:41AM 7 points [-]

Absence of 10^26 paperclips is evidence of Eliezer Yudkowsky

(From an actual Cards against Rationality game we played)

Comment author: jaibot 07 May 2012 01:39:41PM 29 points [-]

Eliezer Yudkowsky updates reality to fit his priors.

Comment author: PhilGoetz 23 March 2009 02:06:03AM *  47 points [-]
  • Omega one-boxes against Eliezer Yudkowsky.
  • If Michelson and Morley had lived A.Y., they would have found that the speed of light was relative to Eliezer Yudkowsky.
  • Turing machines are not Eliezer-complete.
  • The fact that the Bible contains errors doesn't prove there is no God. It just proves that God shouldn't try to play Eliezer Yudkowsky.
  • Eliezer Yudkowsky has measure 1.
  • Eliezer Yudkowsky doesn't wear glasses to see better. He wears glasses that distort his vision, to avoid violating the uncertainty principle.
Comment author: Normal_Anomaly 02 January 2011 02:58:25AM 12 points [-]

Eliezer Yudkowsky doesn't wear glasses to see better. He wears glasses that distort his vision, to avoid violating the uncertainty principle.

Eliezer Yudkowsky took his glasses off once. Now he calls it the certainty principle.

Comment author: army1987 23 May 2013 01:31:59PM 18 points [-]

Eliezer Yudkowsky mines bitcoins in his head.

Comment author: PeerInfinity 23 November 2010 10:54:57PM 26 points [-]

Eliezer Yudkowsky can make Chuck Norris shave his beard off by using text-only communication

(stolen from here)

Comment author: army1987 03 January 2012 01:07:42AM 5 points [-]

Now I'm too curious whether this would actually be true. Would the two of them test this if I paid them $50 each (plus an extra $10 for the winner)?

Comment author: gwern 03 January 2012 02:05:57AM 4 points [-]

$50 won't even get you in to talk to Norris. (Wouldn't do it even at his old charity martial arts things.) Maybe not Eliezer either. Norris is kept pretty darn busy in part due to his memetic status.

Comment author: army1987 03 January 2012 11:50:26AM *  2 points [-]

Well, I'd increase those figures by a few orders of magnitude ... if I had a few orders of magnitude more money than I do now. :-)

Comment author: dfranke 22 March 2009 09:17:07PM 42 points [-]

Eliezer Yudkowsky can escape an AI box while wearing a straitjacket and submerged in a shark tank.

Comment author: Jotto999 12 August 2013 01:19:51PM 6 points [-]

I don't always think clearly, but when I do, I prefer Yudkowsky.

Stay friendly, AIs.

Comment author: Grognor 28 January 2012 12:13:26PM 16 points [-]

Eliezer Yudkowsky did the impossible for practice.

Comment author: orthonormal 24 March 2009 06:07:13PM 35 points [-]

Eliezer Yudkowsky knows exactly how best to respond to this thread; he's just left it as homework for us.

Comment author: timujin 15 January 2014 05:31:14PM *  14 points [-]

• Eliezer Yudkowsky uses blank territories for drafts.

• Just before this universe runs out of negentropy, Eliezer Yudkowsky will persuade the Dark Lords of the Matrix to let him out of the universe.

• Eliezer Yudkowsky signed up for cryonics to be revived when technologies are able to make him an immortal alicorn princess.

• Eliezer Yudkowsky's MBTI type is TTTT.

• Eliezer Yudkowsky's punch is the only way to kill a quantum immortal person, because he is guaranteed to punch him in all Everett branches.

• "Turns into an Eliezer Yudkowsky fact when preceded by its quotation" turns into an Eliezer Yudkowsky fact when preceded by its quotation.

• Lesser minds cause wavefunction collapse. Eliezer Yudkowsky's mind prevents it.

• Planet Earth was originally a mechanism designed by aliens to produce Eliezer Yudkowsky from sunlight.

• The real world doesn't make sense. This world is just Eliezer Yudkowsky's fanfic of it. With Eliezer Yudkowsky as a self-insert.

• When Eliezer Yudkowsky takes nootropics, the universe starts to lag from the lack of processing power.

• Eliezer Yudkowsky can kick your ass in an uncountably infinite number of counterfactual universes simultaneously.

Comment author: wedrifid 19 July 2014 06:57:29PM 2 points [-]

Eliezer Yudkowsky's MBTI type is TTTT.

Love it.

Eliezer Yudkowsky can kick your ass in an uncountably infinite number of counterfactual universes simultaneously.

This one seems to be true. True of Eliezer Yudkowsky and true of every other human living or dead (again simultaneously). "Uncountably infinite counterfactual universes" make most mathematically coherent tasks kind of trivial. This is actually a less impressive feat than, say, "Chuck Norris contains at least one water molecule".

Comment author: Wei_Dai 29 July 2010 05:47:48PM *  34 points [-]

We're all living in a figment of Eliezer Yudkowsky's imagination, which came into existence as he started contemplating the potential consequences of deleting a certain Less Wrong post.

Comment author: Larks 30 July 2010 11:04:51PM 6 points [-]

Interesting thought:

Assume that our world can't survive by itself, and that this world is destroyed as soon as Eliezer finishes contemplating.

Assume we don't value worlds other than those that diverge from the current one, or at least that we care mainly about that one, and that we care more about worlds or people in proportion to their similarity to ours.

In order to keep this world (or collection of multiple-worlds) running for as long as possible, we need to estimate the utility of the Not-Deleting worlds, and keep our total utility close enough to theirs that Eliezer isn't confident enough to decide either way.

As a second goal, we need to make this set of worlds have a higher utility than the others, so that if he does finish contemplating, he'll decide in favour of ours.

These are just the general characteristics of this sort of world (similar to some of Robin Hanson's thought). Obviously, this contemplation is a special case, and we're not going to explain the special consequences in public.

Comment author: SilasBarta 29 July 2010 06:06:28PM 3 points [-]

Wow! So the real world never had the PUA flamewar!

Comment author: Blueberry 30 July 2010 07:29:04AM *  7 points [-]

No, the PUA flamewar occurred in both worlds: this world just diverged from the real one a few days ago, after Roko made his post.

Comment author: NancyLebovitz 30 July 2010 12:36:44AM 3 points [-]

Eliezer may have a little more fondness for chaos than his non-fiction posts suggest.

Comment author: patrissimo 06 August 2011 03:44:52AM 20 points [-]

Eliezer Yudkowsky's keyboard only has two keys: 1 and 0.

Comment author: patrissimo 06 August 2011 03:44:37AM 20 points [-]

The speed of light used to be much lower before Eliezer Yudkowsky optimized the laws of physics.

Comment author: Liron 23 March 2009 09:08:15AM 54 points [-]

Eliezer two-boxes on Newcomb's problem, and both boxes contain money.

Comment author: SilasBarta 18 October 2010 06:22:12PM 62 points [-]

Eliezer Yudkowsky holds the honorary title of Duke Newcomb.

Comment author: ata 18 October 2010 09:34:59PM *  27 points [-]

Eliezer seals a cat in a box with a sample of radioactive material that has a 50% chance of decaying after an hour, and a device that releases poison gas if it detects radioactive decay. After an hour, he opens the box and there are two cats.

Comment author: Vivi 15 September 2011 10:59:09PM 1 point [-]

So Eliezer is simultaneously dead and alive?

Comment author: DanielLC 18 October 2010 05:30:53AM 21 points [-]

Eliezer three-boxes on Newcomb's problem.

Comment author: RobinZ 18 October 2010 03:22:55PM 17 points [-]

Eliezer Omegas on Newcomb's problem.

Comment author: SamE 13 July 2010 06:17:17AM *  17 points [-]
  • Eliezer Yudkowsky two-boxes on Newcomb's problem, and both boxes contain $1 million.
  • Eliezer Yudkowsky two-boxes on Newcomb's problem, and both boxes contain solid utils.
Comment author: Will_Newsome 10 March 2010 11:32:35PM 18 points [-]

Unlike Frodo, Eliezer Yudkowsky had no trouble throwing the Ring into the fires of Mount Foom.

Comment author: Benja 09 September 2012 10:32:56PM 16 points [-]

When Eliezer Yudkowsky once woke up as Britney Spears, he recorded the world's most-reviewed song about leveling up as a rationalist.

Eliezer Yudkowsky got Clippy to hold off on reprocessing the solar system by getting it hooked on HP:MoR, and is now writing more slowly in order to have more time to create FAI.

If you need to save the world, you don't give yourself a handicap; you use every tool at your disposal, and you make your job as easy as you possibly can. That said, it is true that Eliezer Yudkowsky once saved the world using nothing but modal logic and a bag of suggestively-named Lisp tokens.

Eliezer Yudkowsky once attended a conference organized by some above-average Powers from the Transcend that were clueful enough to think "Let's invite Eliezer Yudkowsky"; but after a while he gave up and left before the conference was over, because he kept thinking "What am I even doing here?"

Eliezer Yudkowsky has invested specific effort into the awful possibility that one day, he might create an Artificial Intelligence so much smarter than him that after he tells it the basics, it will blaze right past him, solve the problems that have weighed on him for years, and zip off to see humanity safely through the Singularity. It might happen, it might not. But he consoles himself with the fact that it hasn't happened yet.

Eliezer Yudkowsky once wrote a piece of rationalist Harry Potter fanfiction so amazing that it got multiple people to actually change their lives in an effort at being more rational. (...hm'kay, perhaps that's not quite awesome enough to be on this list... but you've got to admit that it's in the neighbourhood.)

Comment author: Benja 04 November 2012 06:55:40PM *  4 points [-]

When Eliezer Yudkowsky does the incredulous stare, it becomes a valid argument.

Comment author: Will_Newsome 07 March 2010 09:24:22PM 25 points [-]

Some people can perform surgery to save kittens. Eliezer Yudkowsky can perform counterfactual surgery to save kittens before they're even in danger.

Comment author: kboon 28 October 2010 04:21:55PM *  20 points [-]

Xkcd's Randall Munroe once counted to zero from both positive and negative infinity, which was no mean feat. Not to be outdone, Eliezer Yudkowsky counted the real numbers between zero and one.

Comment author: lukstafi 28 January 2011 03:54:14AM *  4 points [-]

You probably live in a simulation, unless you know Eliezer Yudkowsky in real life.

Comment author: wedrifid 28 January 2011 03:56:04AM 13 points [-]

I would expect Eliezer Yudkowsky to be slightly more likely to simulate people he does know in real life.

Comment author: Larks 11 April 2011 12:14:00AM 4 points [-]

And other people to want to simulate Eliezer

Comment author: avalot 19 October 2010 04:22:58PM 34 points [-]
  • The sound of one hand clapping is "Eliezer Yudkowsky, Eliezer Yudkowsky, Eliezer Yudkowsky..."
  • Eliezer Yudkowsky displays search results before you type.
  • Eliezer Yudkowsky's name can't be abbreviated. It must take up most of your tweet.
  • Eliezer Yudkowsky doesn't actually exist. All his posts were written by an American man with the same name.
  • If Eliezer Yudkowsky falls in the forest, and nobody's there to hear him, he still makes a sound.
  • Eliezer Yudkowsky doesn't believe in the divine, because he's never had the experience of discovering Eliezer Yudkowsky.
  • "Eliezer Yudkowsky" is a sacred mantra you can chant over and over again to impress your friends and neighbors, without having to actually understand and apply rationality in your life. Nifty!
Comment author: timujin 14 January 2014 10:27:27AM 2 points [-]

The last one actually works!

Comment author: Meni_Rosenfeld 18 October 2010 07:23:44PM *  14 points [-]

When Eliezer Yudkowsky divides by zero, he gets a singularity.

Comment author: ata 09 September 2010 01:52:11AM 14 points [-]

If giants have been able to see further than others, it is because they have stood on the shoulders of Eliezer Yudkowsky.

Comment author: Technologos 23 March 2009 04:05:31AM *  32 points [-]
  • Eliezer Yudkowsky has counted to Aleph 3^^^3.
  • The payoff to defection in the Prisoner's Dilemma against Eliezer Yudkowsky is a paperclip. In the eye.
  • The Peano axioms are complete and consistent for Eliezer Yudkowsky.
  • In an Iterated Prisoner's Dilemma between Chuck Norris and Eliezer Yudkowsky, Chuck always cooperates and Eliezer defects. Chuck knows not to mess with his superiors.
  • Eliezer Yudkowsky's brain actually exists in a Hilbert space.
  • In Japan, it is common to hear the phrase Eliezer naritai!
Comment author: Jonii 12 August 2010 05:24:38PM 13 points [-]

Eliezer naritai!

Kinda irrelevant, but this should be "Eliezer ni naritai", since omitting "ni" is only for some rare Japanese adjectives, rite?

Comment author: SpaceFrank 28 February 2012 02:14:11PM 9 points [-]

ph'nglui mglw'nafh Eliezer Yudkowsky Clinton Township wgah'nagl fhtagn

Doesn't really roll off the tongue, does it.

(http://en.wikipedia.org/wiki/Cryonics_Institute)

Comment author: JGWeissman 07 March 2010 09:59:12PM 13 points [-]

Eliezer Yudkowsky once explained:

To answer precisely, you must use beliefs like Earth's gravity is 9.8 meters per second per second, and This building is around 120 meters tall. These beliefs are not wordless anticipations of a sensory experience; they are verbal-ish, propositional. It probably does not exaggerate much to describe these two beliefs as sentences made out of words. But these two beliefs have an inferential consequence that is a direct sensory anticipation - if the clock's second hand is on the 12 numeral when you drop the ball, you anticipate seeing it on the 5 numeral when you hear the crash.

Experiments conducted near the building in question determined the local speed of sound to be 6 meters per second.

(Hat Tip)
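For anyone checking the arithmetic behind the punchline, here is a rough sketch (assuming the quoted figures, with the clock at the drop point): the second hand moving from the 12 to the 5 numeral means 25 seconds elapse between drop and crash, but the fall itself accounts for only about 5 of them, so the remaining 20 seconds must be the sound crawling back up.

\[ t_{\text{fall}} = \sqrt{2h/g} = \sqrt{2 \cdot 120\,\mathrm{m} / 9.8\,\mathrm{m/s^2}} \approx 4.9\,\mathrm{s} \]
\[ t_{\text{sound}} \approx 25\,\mathrm{s} - 5\,\mathrm{s} = 20\,\mathrm{s} \quad\Rightarrow\quad v_{\text{sound}} = 120\,\mathrm{m} / 20\,\mathrm{s} = 6\,\mathrm{m/s} \]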

Comment author: steven0461 23 March 2009 02:46:26PM 19 points [-]

Snow is white if and only if that's what Eliezer Yudkowsky wants to believe.

Comment author: [deleted] 18 October 2009 02:19:21AM 24 points [-]

Ironically, this is mathematically true. (Assuming Eliezer hasn't forsaken epistemic rationality, that is.) It's just that if Eliezer changes what he wants to believe, the color of snow won't change to reflect it.

Comment author: JohannesDahlstrom 01 April 2010 09:01:25AM 7 points [-]

It's just that if Eliezer changes what he wants to believe, the color of snow won't change to reflect it.

What?! Blasphemy!

Comment author: DanielLC 18 October 2010 05:34:28AM 17 points [-]

No, it's also mathematically true. He won't change what he wants to believe.

Comment author: thomblake 18 May 2010 03:29:35PM 9 points [-]
Comment author: Will_Newsome 10 March 2010 08:36:56AM *  9 points [-]

Eliezer Yudkowsky can slay Omega with two suicide rocks and a sling.

Comment author: CronoDAS 24 March 2009 03:11:15AM 9 points [-]

Eliezer Yudkowsky can solve NP-complete problems in polynomial time.

Comment author: Meni_Rosenfeld 19 October 2010 03:21:11PM 7 points [-]

Eliezer Yudkowsky can solve EXPTIME-complete problems in polynomial time.

Comment author: Will_Newsome 31 October 2010 04:07:47PM *  29 points [-]

I think Less Wrong is a pretty cool guy. eh writes Hary Potter fanfic and doesnt afraid of acausal blackmails.

Comment author: roland 24 March 2009 02:32:44AM *  13 points [-]
  • When Eliezer Yudkowsky wakes up in the morning he asks himself: why do I believe that I'm Eliezer Yudkowsky?
Comment author: gwern 24 March 2009 12:42:57AM 14 points [-]

Eliezer Yudkowsky's favorite food is printouts of Rice's theorem.

This isn't bad, but I think it can be better. Here's my try:

You eat Rice Krispies for breakfast; Eliezer Yudkowsky eats Rice theorems.

Comment author: therufs 27 February 2013 05:49:11PM 9 points [-]

Eliezer Yudkowsky's Patronus is Harry Potter.

Comment author: Larks 27 February 2013 06:26:58PM 11 points [-]

Eliezer Yudkowsky is his own Patronus.

Comment author: Konkvistador 09 September 2012 08:10:36AM *  15 points [-]

Eliezer's approval makes actions tautologically non-abusive.

Comment author: James_Miller 16 June 2009 06:38:26PM *  12 points [-]

The Busy Beaver function was created to quantify Eliezer Yudkowsky's IQ.

Comment author: RolfAndreassen 07 September 2009 08:00:44PM 12 points [-]

You do not really know anything about Eliezer Yudkowsky until you can build one from rubber bands and paperclips. Unfortunately, doing so would require that you first transform all matter in the Universe into paperclips and rubber bands, otherwise you will not have sufficient raw materials. Consequently, if you are ignorant about Eliezer Yudkowsky (which has just been shown), this is a statement about Eliezer Yudkowsky, not about your state of knowledge.

Comment author: chaosmosis 17 July 2012 01:45:16AM 7 points [-]

If you see Eliezer Yudkowsky on the road, do not kill him.

Comment author: Document 17 July 2012 04:37:51PM 10 points [-]

If you see Eliezer Yudkowsky on the road, Pascal's-mug him.

Comment author: gwern 17 July 2012 03:41:12PM 6 points [-]

If you meet the Eliezer on the road, cryopreserve it!

Comment author: roland 02 November 2012 10:30:33PM 4 points [-]
Comment author: lockeandkeynes 16 September 2010 05:03:49AM 4 points [-]

Eliezer Yudkowsky only drinks from Klein Bottles.

Comment author: shware 19 July 2014 02:14:04PM 3 points [-]

I feel this should not be in featured posts, as amusing as it was at the time

Comment author: [deleted] 28 February 2012 12:07:23AM 3 points [-]

Question: What is your verdict on my observation that the jokes on this page would be less hilarious if they used only Eliezer's first name instead of the full 'Eliezer Yudkowsky'?

I speculate that some of the humor derives from using the full name — perhaps because of how it sounds, or because of the repetition, or even simply because of the length of the name.

Comment author: Sarokrae 13 September 2012 03:18:47PM *  5 points [-]

The consonant "k" is funny, according to I think something Richard Wiseman once wrote...

Comment author: army1987 28 February 2012 01:09:45AM *  4 points [-]

...or even because it pattern-matches Chuck Norris jokes, which use the actor's full name.

ETA: On the other hand, Yudkowsky alone does have the same number of syllables and stress pattern as Chuck Norris, and the sheer length of the full name does contribute to the effect of this IMO.

Comment author: roland 09 December 2011 07:21:50PM 5 points [-]

A Russian pharmacological company was trying to make a drug against stupidity with the name of "EliminateStupodsky"; the result was Eliezer Yudkowsky.

Comment author: J_Taylor 09 December 2011 07:26:07PM *  45 points [-]

When I read part of this in Recent Comments, I was almost entirely sure this comment would be spam. This is probably one of the few legit comments ever made which began with "A Russian pharmacological company."

Comment author: Rune 22 March 2009 10:34:39PM 5 points [-]
  • Eliezer Yudkowsky can isolate magnetic monopoles; he gives them to small orphan children as birthday presents.
  • Eliezer Yudkowsky once challenged God to a contest to see who knew the most about physics. Eliezer Yudkowsky won and disproved God.
  • Eliezer Yudkowsky once checkmated Kasparov in seven moves — while playing Monopoly.
  • At the age of eight, Eliezer Yudkowsky built a fully functional AGI out of LEGO.
  • Eliezer Yudkowsky never includes error estimates in his experimental write-ups: his results are always exact by definition.
  • When foxes have a good idea they say it is "as cunning as Eliezer Yudkowsky".
  • Apple pays Eliezer Yudkowsky 99 cents every time he listens to a song.
  • Eliezer Yudkowsky can kill two stones with one bird.
  • When the Boogeyman goes to sleep every night, he checks his closet for Eliezer Yudkowsky.
  • Eliezer Yudkowsky can derive the Axiom of Choice from ZF Set Theory.
Comment author: bogdanb 11 January 2013 09:08:09AM 11 points [-]

At the age of eight, Eliezer Yudkowsky built a fully functional AGI out of LEGO. It's still fooming, just very, very slowly.

Comment author: PhilGoetz 23 March 2009 03:33:25AM 15 points [-]

These are funny. But some are from a website about Chuck Norris! Don't incite Chuck's wrath against Eliezer.

If Chuck Norris and Eliezer ever got into a fight in just one world, it would destroy all possible worlds. Fortunately there are no possible worlds in which Eliezer lets this happen.

Comment author: Yvain 23 March 2009 10:43:10AM 48 points [-]

All problems can be solved with Bayesian logic and expected utility. "Bayesian logic" and "expected utility" are the names of Eliezer Yudkowsky's fists.

Comment author: PaulWright 23 March 2009 02:00:14AM 6 points [-]

Eliezer Yudkowsky can isolate magnetic monopoles

Nah, that's Dave Green. You'd better hope Dr Green doesn't find out...

Comment author: [deleted] 19 October 2010 06:18:37PM 4 points [-]

There is no chin behind Eliezer Yudkowsky's beard. There is only another brain.

Comment author: [deleted] 25 February 2013 10:01:48AM 3 points [-]

Posts like this reinforce the suspicion that LessWrong is a personality cult.

Comment author: Fadeway 25 February 2013 11:11:28AM 13 points [-]

I disagree. This entire thread is so obviously a joke, one could only take it as evidence if they've already decided what they want to believe and are just looking for arguments.

It does show that EY is a popular figure around here, since nobody goes around starting Chuck Norris threads about random people, but that's hardly evidence for a cult. Hell, in the case of Norris himself, it's the opposite.

Comment author: IlyaShpitser 25 February 2013 11:46:10AM *  4 points [-]

http://www.overcomingbias.com/2011/01/how-good-are-laughs.html

http://www.overcomingbias.com/2010/07/laughter.html

I find these "jokes" pretty creepy myself. The facts about Chuck Norris is that he's a washed up actor selling exercise equipment. I think Chuck Norris jokes/stories are a modern internet version of Paul Bunyan stories in American folklore or bogatyr stories in Russian folklore. There is danger here -- I don't think these stories are about humor.

Comment author: patrissimo 06 August 2011 03:44:11AM 3 points [-]

Eliezer Yudkowsky doesn't have a chin; underneath his beard is another brain.

Comment author: xamdam 29 July 2010 07:41:23PM 3 points [-]
  • The problem with CEV is that the coherence requirement will force it to equal whatever Eliezer wants in the limit.
Comment author: army1987 04 December 2013 06:51:27PM 2 points [-]

Eliezer Yudkowsky once brought peace to the Middle East from inside a freight container, through a straw.

This one doesn't sound particularly EY-related to me; it might as well be Chuck Norris.

Comment author: David_Gerard 20 December 2013 11:47:43AM 8 points [-]

It's an AI-Box joke.

Comment author: insaneabd 14 April 2009 08:07:11PM 2 points [-]

An Eliezer Yudkowsky article a day keeps irrationality away.

Comment author: roland 09 December 2011 07:18:35PM 7 points [-]

Slight improvement?

An Eliezer Yudkowsky article a day keeps irrationality at bay.

Comment author: woodside 17 July 2012 05:19:09PM 3 points [-]

An Eliezer Yudkowsky post a day keeps the bias at bay.

Comment author: SilasBarta 17 June 2009 01:54:12AM 2 points [-]

There is no "time", just events Eliezer Yudkowsky has felt like allowing.

Comment author: lukeprog 10 April 2011 11:59:31PM 1 point [-]

Oh my God this is such a great thread.

Comment author: Annoyance 23 March 2009 02:50:45PM -1 points [-]

Why is this posted to LessWrong?

What does it have to do with being less wrong or sharpening our rationality?

Comment author: BenRayfield 31 December 2009 08:40:07PM 12 points [-]

We are Borg. You will be assimilated. Resistance is futile. If Star Trek's Borg Collective came to assimilate everyone on Earth, Eliezer Yudkowsky would engage them in logical debate until they agreed to come back later, after our technology had increased exponentially for some number of years, making us a more valuable thing for them to assimilate. Also, he would underestimate how fast our technology increases by just enough that when the Borg came back, we would be the stronger force.

Why is this posted to LessWrong? What does it have to do with being less wrong or sharpening our rationality?

Rational minds need comedy too, or they go insane. Much of this is vaguely related to rationalist subjects, so it does not fit well on other websites.

Comment author: DanArmak 31 December 2009 08:58:06PM *  4 points [-]

Rational minds need comedy too, or they go insane.

Not necessarily. It's just that we are very far from being perfectly rational.

Comment author: BenRayfield 01 January 2010 01:41:40AM 3 points [-]

Not necessarily. It's just that we are very far from being perfectly rational.

You're right. I wrote "rational minds" in general when I was thinking about the most rational few people today. I did not mean that any perfectly rational mind exists.

Most or all Human brains tend to work better if they experience certain kinds of things that may include wasteful parts, like comedy, socializing, and dreaming. It's not rational to waste more than you have to. Today we do not have enough knowledge and control over our minds to optimize away all our wasteful/suboptimal thoughts.

I have no reason to think that, in the "design space" of all possible minds, there exist 0, or more than 0, perfectly rational minds that tend to think more efficiently after experiencing comedy.

I do have a reason to slightly bias it toward "there exists more than 0", because Humans and monkeys have a sense of humor that helps them think better if used at least once per day, but when thinking about exponential-size intelligence, that slight bias becomes an epsilon. Epsilon can be important if you're completely undecided, but usually it's best to look for ideas somewhere else before considering an epsilon-size chance. What people normally call "smarter than Human intelligence" is also an epsilon-size intelligence in this context, so the 2 things are not epsilon when compared to each other.

The main thing I've figured out here is to be more open-minded about if comedy (and similar things) can increase the efficiency of a rational mind or not. I let an assumption get into my writing.

Comment author: steven0461 23 March 2009 02:55:29PM *  1 point [-]

Not a lot, I guess. I had part of it lying around as an old blog post draft and it seemed fitting given recent discussions.

Comment author: mfb 11 February 2012 09:01:55PM 1 point [-]

Based on my utility function, it gives me utils to read this.

Eliezer Yudkowsky does not decide rationally between multiple options. He takes all options in parallel.

Comment author: BrentAllsop 24 July 2009 01:36:32AM 0 points [-]

Eliezer Yudkowsky doesn't fear unfriendly AIs; he just wants everyone else to fear them.

Comment author: steven0461 22 March 2009 08:51:25PM *  1 point [-]

If it's apparently THAT bad an idea (and/or execution), is it considered bad form to just delete the whole thing?

(edit: this post now obsolete; thanks, all)

Comment author: gjm 22 March 2009 09:23:58PM 23 points [-]

Add me to the list of people who thought it was laugh-out-loud funny. I'm glad this sort of thing doesn't make up a large fraction of LW articles but please, no, don't delete it.

Comment author: CarlShulman 22 March 2009 09:02:29PM 9 points [-]

I laughed out loud, and I'd say keep it but don't promote it.

Comment author: Z_M_Davis 22 March 2009 08:53:45PM 6 points [-]

Leave it up! This is hilarious; thank you!

Comment author: pjeby 22 March 2009 09:03:01PM 28 points [-]

I agree as well.

ObEliezerFact: Eliezer Yudkowsky didn't run away from grade school... grade school ran away from Eliezer Yudkowsky.

Comment author: ciphergoth 22 March 2009 10:52:15PM 4 points [-]

No, I like this game! Nearly all the ones up to and including one-boxing are giggle-out-loud funny, and there are some gems after that too.

Comment author: Will_Newsome 16 April 2010 12:29:23AM 1 point [-]

Eliezer Yudkowsky is a superstimulus for perfection.

Comment author: DSimon 06 August 2011 04:57:27AM *  -1 points [-]

Eliezer Yudkowsky already knows how to shot web.