Less Wrong is a community blog devoted to refining the art of human rationality.

A Transhumanist Poem

11 Swimmer963 05 March 2011 09:16AM

**Note: I'm not a poet. I hardly ever write poetry, and when I do, it's usually because I've stayed up all night. However, this seemed like a very appropriate poem for Less Wrong. Not sure if it's appropriate as a top-level post. Someone please tell me if not.**

 

Imagine

The first man

Who held a stick in rough hands

And drew lines on a cold stone wall

Imagine when the others looked

When they said, I see the antelope

I see it. 

 

Later on their children's children

Would build temples, and sing songs

To their many-faced gods.

Stone idols, empty staring eyes

Offerings laid on a cold stone altar

And left to rot. 

 

Yet later still there would be steamships

And trains, and numbers to measure the stars

Small suns ignited in the desert

One man's first step on an airless plain

 

Now we look backwards

At the ones who came before us

Who lived, and swiftly died. 

The first man's flesh is in all of us now

And for his and his children's sake

We imagine a world with no more death

And we see ourselves reflected

In the silicon eyes

Of our final creation

September 2010 Southern California Meetup

10 JenniferRM 13 September 2010 02:31AM

The second LessWrong meetup for Southern California will happen on Saturday September 25th, 2010!  The meetup will start at 1PM and probably run for 4 or 5 hours in the Platt Campus Center of Harvey Mudd College.  Thanks are due to the Harvey Mudd Future Tech Club for the location.

We will be meeting in a conference room with tables, chairs, whiteboards, and a projector.  At the July SoCal LW Meetup we discovered that small-group conversations were the most interesting part, but a pub turned out to be a tricky venue for them.  This time we're optimizing for conversation, but pizza will almost assuredly be ordered and there will be free cookies and soda!


Transhumanism and the denotation-connotation gap

19 PhilGoetz 18 August 2010 03:33PM

A word's denotation is our conscious definition of it.  You can think of this as the set of things in the world with membership in the category defined by that word; or as a set of rules defining such a set.  (Logicians call the former the category's extension into the world.)

A word's connotation is the emotional coloring of the word.  AI geeks may think of it as a set of pairs, each pairing another concept that the word activates or inhibits with the change in the odds of recalling that concept.
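The two notions above can be sketched in code: denotation as a membership rule over a category, connotation as a mapping from associated concepts to activation deltas. This is a minimal toy model; every name and number in it is an illustrative assumption, not anything from the post.

```python
# Denotation: a rule deciding membership in the category (its extension).
def denotes_human(entity):
    """Bare, analytic membership test for the word 'human'."""
    return entity.get("species") == "Homo sapiens"

# Connotation: each associated concept paired with the change in the
# odds of recalling it when the word is heard (positive = activated,
# negative = inhibited). Values are made up for illustration.
connotation_human = {
    "warmth": +0.8,
    "dignity": +0.7,
    "machine": -0.5,
}

def activated_concepts(connotation, threshold=0.0):
    """Concepts whose recall odds go up when the word is heard."""
    return {c for c, delta in connotation.items() if delta > threshold}

print(denotes_human({"species": "Homo sapiens"}))        # True
print(sorted(activated_concepts(connotation_human)))      # ['dignity', 'warmth']
```

The gap the post describes is then the mismatch between the crisp membership test and the fuzzy activation map: legislation is written against the former, while votes and value judgments run on the latter.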

When we think analytically about a word - for instance, when writing legislation - we use its denotation.  But when we are in values/judgement mode - for instance, when deciding what to legislate about, or when voting - we use its denotation less and its connotation more.

This denotative-connotative gap can cause people to behave less rationally when they become more rational.  People who think and act emotionally are at least consistent.  Train them to think analytically, and they will choose goals using connotation but pursue them using denotation.  That's like hiring a Russian speaker to manage your affairs because he's smarter than you, but you have to give him instructions via Google translate.  Not always a win.

Consider the word "human".  It has wonderful connotations, to humans.  Human nature, humane treatment, the human condition, what it means to be human.  Often the connotations are normative rather than descriptive; behaviors we call "inhumane" are done only by humans.  The denotation is bare by comparison:  Featherless biped.  Homo sapiens, as defined by 3 billion base pairs of DNA.


Shortness is now a treatable condition

9 taw 20 October 2009 01:13AM

There was some talk here about height taxes, but there's a better solution - redefine shortness as a treatable condition and use HGH to cure it. They even got the FDA on board with that, at least for the shortest 1.2% of people.

Unsatisfactory sexual performance became a treatable condition with Viagra. Depression and hyperactivity became treatable conditions with SSRIs. Being ugly is already almost considered a treatable condition, at least judging by cosmetic surgery ads. Being overweight is universally considered an illness, even though we don't have many effective treatment options (surgery is unpopular, and effective drugs like fen-phen and ECA are not officially prescribed any more). If we ever figure out how to increase IQ, you can be certain that low IQ will be considered a treatable condition too. Almost everything undesirable gets redefined as an illness as soon as an effective way to fix it is developed.

I welcome these changes. Yes, redefining large parts of normal human variability as illness is a lie, but if that's what society needs to work around its taboos against human enhancement, so be it.

You Only Live Twice

66 Eliezer_Yudkowsky 12 December 2008 07:14PM

"It just so happens that your friend here is only mostly dead.  There's a big difference between mostly dead and all dead."
        -- The Princess Bride

My co-blogger Robin and I may disagree on how fast an AI can improve itself, but we agree on an issue that seems much simpler to us than that:  At the point where the current legal and medical system gives up on a patient, they aren't really dead.

Robin has already said much of what needs saying, but a few more points:

Ben Best's Cryonics FAQ, Alcor's FAQ, Alcor FAQ for scientists, Scientists' Open Letter on Cryonics

• I know more people who are planning to sign up for cryonics Real Soon Now than people who have actually signed up.  I expect that more people have died while cryocrastinating than have actually been cryopreserved.  If you've already decided this is a good idea, but you "haven't gotten around to it", sign up for cryonics NOW.  I mean RIGHT NOW.  Go to the website of Alcor or the Cryonics Institute and follow the instructions.

