Awww, a Zebra

23 Eliezer_Yudkowsky 01 October 2008 01:28AM

This image recently showed up on Flickr (original is nicer):

[Image: Zebra_4]

With the caption:

"Alas for those who turn their eyes from zebras and dream of dragons!  If we cannot learn to take joy in the merely real, our lives shall be empty indeed." —Eliezer S. Yudkowsky.

"Awww!", I said, and called my girlfriend over to look.

"Awww!", she said, and then looked at me, and said, "I think you need to take your own advice!"

Me:  "But I'm looking at the zebra!"
Her:  "On a computer!"
Me:  (Turns away, hides face.)
Her:  "Have you ever even seen a zebra in real life?"
Me:  "Yes!  Yes, I have!  My parents took me to Lincoln Park Zoo!  ...man, I hated that place."


Part of the Joy in the Merely Real subsequence of Reductionism

Next post: "Hand vs. Fingers"

Previous post: "Initiation Ceremony"

Fake Norms, or "Truth" vs. Truth

16 Eliezer_Yudkowsky 22 July 2008 10:23AM

Followup to:  Applause Lights

When you say the word "truth", people know that "truth" is a good thing, and that they're supposed to applaud.  So it might seem like there is a social norm in favor of "truth".  But when it comes to some particular truth, like whether God exists, or how likely their startup is to thrive, people will say:  "I just want to believe" or "you've got to be optimistic to succeed".

So Robin and I were talking about this, and Robin asked me how it is that people prevent themselves from noticing the conflict.

I replied that I don't think active prevention is required.  First, as I quoted Michael Vassar:

"It seems to me that much of the frustration in my life prior to a few years ago has been due to thinking that all other human minds necessarily and consistently implement modus ponens."

But more importantly, I don't think there does exist any social norm in favor of truth.  There's a social norm in favor of "truth".  There's a difference.

continue reading »

The Fear of Common Knowledge

21 Eliezer_Yudkowsky 09 July 2008 09:48AM

Followup to:  Belief in Belief

One of those insights that made me sit upright and say "Aha!"  From The Uncredible Hallq:

Minor acts of dishonesty are integral to human life, ranging from how we deal with casual acquaintances to writing formal agreements between nation states.  Steven Pinker has an excellent chapter on this in The Stuff of Thought, a version of which can be found at TIME magazine’s website. What didn’t make it into the TIME version is Pinker’s proposal that, while there are several reasons we do this, the most important reason is to avoid mutual knowledge:  "She probably knows I just blew a pass at her, but does she know I know she knows? Does she know I know she knows I know she knows?"  Etc.  Mutual knowledge is that nightmare where, for all intents and purposes, the known-knows can be extended out to infinity.  The ultimate example of this has to be the joke "No, it wasn’t awkward until you said, 'well, this is awkward.'"  A situation might be a little awkward, but what’s really awkward is mutual knowledge, created when someone blurts out what’s going on for all to hear...

The story of the Emperor’s New Clothes is another example of the power of mutual knowledge...

continue reading »

Fake Optimization Criteria

30 Eliezer_Yudkowsky 10 November 2007 12:10AM

Followup to:  Fake Justification, The Tragedy of Group Selectionism

I've previously dwelt at considerable length upon forms of rationalization whereby our beliefs appear to match the evidence much more strongly than they actually do.  And I'm not overemphasizing the point, either.  If we could beat this fundamental metabias and see what every hypothesis really predicted, we would be able to recover from almost any other error of fact.

The mirror challenge for decision theory is seeing which option a choice criterion really endorses.  If your stated moral principles call for you to provide laptops to everyone, does that really endorse buying a $1 million gem-studded laptop for yourself, or spending the same money on shipping 5000 OLPCs?

We seem to have evolved a knack for arguing that practically any goal implies practically any action.  A phlogiston theorist explaining why magnesium gains weight when burned has nothing on an Inquisitor explaining why God's infinite love for all His children requires burning some of them at the stake.

There's no mystery about this.  Politics was a feature of the ancestral environment.  We are descended from those who argued most persuasively that the good of the tribe meant executing their hated rival Uglak.  (We sure ain't descended from Uglak.) 

continue reading »

Beware of Stephen J. Gould

27 Eliezer_Yudkowsky 06 November 2007 05:22AM

Followup to:  Natural Selection's Speed Limit and Complexity Bound

If you've read anything Stephen J. Gould has ever said about evolutionary biology, I have some bad news for you.  In the field of evolutionary biology at large, Gould's reputation is mud.  Not because he was wrong.  Many honest scientists have made honest mistakes.  What Gould did was much worse, involving deliberate misrepresentation of science.

In his 1996 book Full House: The Spread of Excellence from Plato to Darwin, Stephen J. Gould explains how modern evolutionary biology is very naive about evolutionary progress.  Foolish evolutionary biologists, says Gould, believe that evolution has a preferred tendency toward progress and the accumulation of complexity.  But of course - Gould kindly explains - this is simply a statistical illusion, bolstered by the tendency to cite hand-picked sequences like bacteria, fern, dinosaurs, dog, man.  You could equally well explain this apparent progress by supposing that evolution is undergoing a random walk, sometimes losing complexity and sometimes gaining it.  If so, Gould says, there will be a left bound, a minimum at zero complexity, but no right bound, and the most complex organisms will seem to grow more complex over time.  Even though it's really just a random walk with no preference in either direction, the distribution widens and the tail gets longer.
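Gould's random-walk picture is easy to check numerically.  The sketch below is purely illustrative (the function name and parameters are my own, not from the post): it runs many lineages as unbiased ±1 walks in "complexity" with a floor at zero, and tracks the complexity of the most complex lineage over time.  Even though no individual step is biased upward, the right tail lengthens and the maximum grows.

```python
import random

def simulate_lineages(n_lineages=1000, n_steps=1000, seed=0):
    """Unbiased +/-1 random walks in 'complexity', reflected at a
    lower bound of zero (nothing simpler than a minimal replicator).
    Returns the maximum complexity in the population at each step."""
    rng = random.Random(seed)
    complexity = [1] * n_lineages
    max_over_time = []
    for _ in range(n_steps):
        for i in range(n_lineages):
            # Each lineage gains or loses a unit of complexity at random,
            # but can never drop below the zero-complexity left bound.
            complexity[i] = max(0, complexity[i] + rng.choice((-1, 1)))
        max_over_time.append(max(complexity))
    return max_over_time

trace = simulate_lineages()
# The most complex lineage ends far above where it started,
# despite every step being a fair coin flip.
print(trace[9], trace[-1])
```

The maximum grows roughly like the square root of time: the distribution widens in both directions, but the left bound at zero means only the rightward spread is visible in the "most complex organism" statistic.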

What romantics, ha ha, those silly evolutionary biologists, believing in progress!  It's a good thing we had a statistically sophisticated thinker like Stephen J. Gould to keep their misconceptions from infecting the general public.  Indeed, Stephen J. Gould was a hero - a martyr - because evolutionary biologists don't like it when you challenge their romantic preconceptions, and they persecuted him.  Or so Gould represented himself to the public.

There's just one problem:  It's extremely unlikely that any modern evolutionary theorist, however much a romantic, would believe that evolution was accumulating complexity.

continue reading »

The "Outside the Box" Box

33 Eliezer_Yudkowsky 12 October 2007 10:50PM

Whenever someone exhorts you to "think outside the box", they usually, for your convenience, point out exactly where "outside the box" is located.  Isn't it funny how nonconformists all dress the same...

In Artificial Intelligence, everyone outside the field has a cached result for brilliant new revolutionary AI idea—neural networks, which work just like the human brain!  New AI Idea: complete the pattern:  "Logical AIs, despite all the big promises, have failed to provide real intelligence for decades—what we need are neural networks!"

This cached thought has been around for three decades.  Still no general intelligence.  But, somehow, everyone outside the field knows that neural networks are the Dominant-Paradigm-Overthrowing New Idea, ever since backpropagation was invented in the 1970s.  Talk about your aging hippies.

Nonconformist images, by their nature, permit no departure from the norm.  If you don't wear black, how will people know you're a tortured artist?  How will people recognize uniqueness if you don't fit the standard pattern for what uniqueness is supposed to look like?  How will anyone recognize you've got a revolutionary AI concept, if it's not about neural networks?

continue reading »

A Rational Argument

39 Eliezer_Yudkowsky 02 October 2007 06:35PM

Followup to:  The Bottom Line, Rationalization

You are, by occupation, a campaign manager, and you've just been hired by Mortimer Q. Snodgrass, the Green candidate for Mayor of Hadleyburg.  As a campaign manager reading a blog on rationality, one question lies foremost on your mind:  "How can I construct an impeccable rational argument that Mortimer Q. Snodgrass is the best candidate for Mayor of Hadleyburg?"

Sorry.  It can't be done.

"What?" you cry.  "But what if I use only valid support to construct my structure of reason?  What if every fact I cite is true to the best of my knowledge, and relevant evidence under Bayes's Rule?"

Sorry.  It still can't be done.  You defeated yourself the instant you specified your argument's conclusion in advance.

continue reading »

Human Evil and Muddled Thinking

40 Eliezer_Yudkowsky 13 September 2007 11:43PM

Followup to:  Rationality and the English Language

George Orwell saw the descent of the civilized world into totalitarianism, the conversion or corruption of one country after another; the boot stamping on a human face, forever, and remember that it is forever.  You were born too late to remember a time when the rise of totalitarianism seemed unstoppable, when one country after another fell to secret police and the thunderous knock at midnight, while the professors of free universities hailed the Soviet Union's purges as progress.  It feels as alien to you as fiction; it is hard for you to take seriously.  Because, in your branch of time, the Berlin Wall fell.  And if Orwell's name is not carved into one of those stones, it should be.

Orwell saw the destiny of the human species, and he put forth a convulsive effort to wrench it off its path.  Orwell's weapon was clear writing.  Orwell knew that muddled language is muddled thinking; he knew that human evil and muddled thinking intertwine like conjugate strands of DNA:

In our time, political speech and writing are largely the defence of the indefensible. Things like the continuance of British rule in India, the Russian purges and deportations, the dropping of the atom bombs on Japan, can indeed be defended, but only by arguments which are too brutal for most people to face, and which do not square with the professed aims of the political parties. Thus political language has to consist largely of euphemism, question-begging and sheer cloudy vagueness. Defenceless villages are bombarded from the air, the inhabitants driven out into the countryside, the cattle machine-gunned, the huts set on fire with incendiary bullets: this is called pacification...

continue reading »

Rationality and the English Language

35 Eliezer_Yudkowsky 12 September 2007 10:55PM

Yesterday, someone said that my writing reminded them of George Orwell's Politics and the English Language.  I was honored.  Especially since I'd already thought of today's topic.

If you really want an artist's perspective on rationality, then read Orwell; he is mandatory reading for rationalists as well as authors.  Orwell was not a scientist, but a writer; his tools were not numbers, but words; his adversary was not Nature, but human evil.  If you wish to imprison people for years without trial, you must think of some other way to say it than "I'm going to imprison Mr. Jennings for years without trial."  You must muddy the listener's thinking, prevent clear images from outraging conscience.  You say, "Unreliable elements were subjected to an alternative justice process."

Orwell was the outraged opponent of totalitarianism and the muddy thinking in which evil cloaks itself—which is how Orwell's writings on language ended up as classic rationalist documents on a level with Feynman, Sagan, or Dawkins.

continue reading »

Applause Lights

88 Eliezer_Yudkowsky 11 September 2007 06:31PM

Followup to:  Semantic Stopsigns, We Don't Really Want Your Participation

At the Singularity Summit 2007, one of the speakers called for democratic, multinational development of AI.  So I stepped up to the microphone and asked:

Suppose that a group of democratic republics form a consortium to develop AI, and there's a lot of politicking during the process—some interest groups have unusually large influence, others get shafted—in other words, the result looks just like the products of modern democracies.  Alternatively, suppose a group of rebel nerds develops an AI in their basement, and instructs the AI to poll everyone in the world—dropping cellphones to anyone who doesn't have them—and do whatever the majority says.  Which of these do you think is more "democratic", and would you feel safe with either?

I wanted to find out whether he believed in the pragmatic adequacy of the democratic political process, or if he believed in the moral rightness of voting.  But the speaker replied:

The first scenario sounds like an editorial in Reason magazine, and the second sounds like a Hollywood movie plot.

Confused, I asked:

Then what kind of democratic process did you have in mind?

continue reading »
