The Singularity in the Zeitgeist

6 dclayh 02 October 2010 06:51AM

As a part of public relations, I think it's important to keep tabs on how the Singularity and related topics (AGI, FAI, life extension, etc.) are presented in the culture at large.  I've posted links to such things in the past, but I think there should be a central clearinghouse, and a discussion-level post seems like the right place. 

So: in the comments, post examples of references to Singularity-related topics that you've found, ideally with a link and a few sentences' description of what the connection is and how it's presented (whether seriously or as an object of ridicule, for instance). 

 

There should probably be a similar post for rationality references, but let's see how this one goes first.

Harry Potter and the Methods of Rationality discussion thread, part 2

13 dclayh 01 August 2010 10:58PM

ETA: There is now a third thread, so send new comments there.

 

Since the first thread has exceeded 500 comments, it seems time for a new one, with Eliezer's just-posted Chapters 33 & 34 to kick things off. 

From previous post: 

Spoiler Warning:  this thread contains unrot13'd spoilers for Harry Potter and the Methods of Rationality up to the current chapter and for the original Harry Potter series.  Please continue to use rot13 for spoilers to other works of fiction, or if you have insider knowledge of future chapters of Harry Potter and the Methods of Rationality.

A suggestion: mention at the top of your comment which chapter you're commenting on, or what chapter you're up to, so that people can understand the context of your comment even after more chapters have been posted.  This can also help people avoid reading spoilers for a new chapter before they realize that there is a new chapter.

Loleliezers

5 dclayh 01 April 2010 04:04AM

Previously: Eliezer Yudkowsky facts, and Kevin's prediction.

 

A bit of silliness for the day.  Below the fold to spare those with delicate sensibilities. 

continue reading »

Comic about the Singularity

2 dclayh 14 January 2010 06:20PM

Today's Saturday Morning Breakfast Cereal.  (Which incidentally is a very funny webcomic I read regularly.)  Mouseover the red button for a bonus panel.

Clearly the author hasn't read the proper Eliezer essay(s) on post-Singularity life.

The utility curve of the human population

5 dclayh 24 September 2009 09:00PM

"Whoever saves a single life, it is as if he had saved the whole world."

  —The Talmud, Sanhedrin 4:5

That was the epigraph Eliezer used on a perfectly nice post reminding us to shut up and multiply when valuing human lives, rather than relying on the (roughly) logarithmic amount of warm fuzzies we'd receive.  Implicit in the expected utility calculation is the idea that the value of human lives scales linearly: indeed, Eliezer explicitly says, "I agree that one human life is of unimaginably high value. I also hold that two human lives are twice as unimaginably valuable."

However, in a comment on Wei Dai's brilliant recent post comparing boredom and altruism, Vladimir Nesov points out that "you can value lives sublinearly" and still make an expected utility calculation rather than relying on warm-fuzzy intuition.  This got me thinking about just what the functional form of U(N), where N is the number of living persons, might be. 
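To make the linear-versus-sublinear contrast concrete, here is a minimal sketch (the function names and scale constants are illustrative, not anything from the post): under a linear U, the marginal value of the next life is constant no matter how many people already exist, while under a logarithmic U it shrinks toward zero.

```python
import math

def linear_value(n, v_per_life=1.0):
    """Linear utility: every life adds the same fixed amount."""
    return v_per_life * n

def log_value(n, scale=1.0):
    """Sublinear (here logarithmic) utility: marginal value falls as n grows."""
    return scale * math.log(1 + n)

# Marginal value of one additional life when a million people already exist
n = 1_000_000
print(linear_value(n + 1) - linear_value(n))  # constant: 1.0
print(log_value(n + 1) - log_value(n))        # tiny: ~1e-6
```

The question in the post is which shape (or some third one) U actually ought to have; the sketch only shows how differently the two candidates behave at the margin.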

continue reading »

Probability distributions and writing style

2 dclayh 04 June 2009 06:17AM

In his recent post, rhollerith wrote,

I am more likely than not vastly better off than I would have been if <I had made decision X>

This reminded me of the slogan for the water-filtration system my workplace uses,

We're 100% sure it's 99.9% pure!

because both sentences make a claim and give an associated probability for it. Now in this second example, the actual version is better than the expectation-value-preserving "We're 99.9% sure it's 100% pure", because the actual version implies a lower variance in outcomes (and expectation values being equal, a lower variance is nearly always better).  But this leads to the question of why rhollerith didn't write something like "I am almost certainly at least somewhat better off than I would have been...". 
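The variance claim can be checked with a toy model.  Treat each slogan as a distribution over purity: the actual slogan delivers 99.9% purity with certainty, while the swapped version delivers 100% purity with probability 0.999 and (as an assumption to keep the expectations equal) 0% purity otherwise.  A minimal sketch:

```python
def mean_var(outcomes):
    """Mean and variance of a discrete distribution given as [(value, prob), ...]."""
    mean = sum(v * p for v, p in outcomes)
    var = sum(p * (v - mean) ** 2 for v, p in outcomes)
    return mean, var

# "We're 100% sure it's 99.9% pure": purity 0.999 with certainty
certain = [(0.999, 1.0)]
# "We're 99.9% sure it's 100% pure": pure with p=0.999, else (assumed) 0% pure
risky = [(1.0, 0.999), (0.0, 0.001)]

print(mean_var(certain))  # (0.999, 0.0)
print(mean_var(risky))    # (0.999, ~0.000999)
```

Same expected purity, but the actual slogan has zero variance, which is why it is the better claim.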

So I ask: when writing nontechnically, do you prefer to give a modest conclusion with high confidence, or a strong conclusion with moderate confidence?  And does this vary with whether you're trying to persuade or merely describe?

(Also feel free to post other examples of this sort of statement from LW or elsewhere; I'd search for them myself if I had any good ideas on how to do so.)

Fiction of interest

10 dclayh 29 April 2009 06:47PM

The fiction piece in this week's New Yorker deals with some of the same themes as Eliezer's "Three Worlds Collide"; viz., the clash of value systems (and the difficulty of seeing those with a different value system as rational), and the idea of humanity developing in ways that seem bizarre/grotesque/evil to us. 

Silver Chairs, Paternalism, and Akrasia

36 dclayh 09 April 2009 09:24PM

Inspired in part by Robin Hanson's excellent article on paternalism a while back, and in response to the various akrasia posts.

In C.S. Lewis's fourth Narnia book, The Silver Chair, the protagonists (two children and a Marsh-wiggle) are faced with a dilemma regarding the title object.  To wit, they meet an eloquent and quite sane-seeming young man, who after a while reveals that he has a mental disorder: for an hour every night, he loses his mind and must be restrained in the Silver Chair; and if he were to be released during that time he would become a giant, evil snake (it is a fantasy novel, after all).  The heroes determine to witness this, and the young man calmly straps himself into the chair.  After a few moments, a change comes over him and he begins struggling and begging for release, claiming the other personality is the false one.  The children are nonplussed: which person(ality) should they believe?  And (a separate question) which should they help?

In the book this dilemma is resolved by means of a cheat*, but we in real life have no such thing.  We do, however, have an abundance of Silver Chairs, in the form of psychotropic drugs from alcohol to hallucinogens to fancy antidepressants and antipsychotics. Of course not every person who takes such drugs is in a Silver Chair situation, but consider for instance the alcoholic who insists he doesn't have a problem, or the paranoid schizophrenic who fears that any drug is an attempt to poison him.  Now we as observers or authorities may know from statistics or even from their personal histories that the detoxed/drugged-up versions of these people would be happy for the change and not want to return to the previous state, but does that mean it's right (in a paternalistic sense, meaning for their own good) to force them towards what we call mental health?

I would say it is not, and that our preference for one side of the Silver Chair over the other is simple bias in favor of mental states similar to our own.  From our places near normality we can't imagine wanting to be in these bizarre mental states, so we assume that the people who are in them don't really want to be either.  They might claim to, sure, but why believe them?  After all, they're crazy.  For two amusing thought experiments in this line which have been considered in detail by others, let the bizarre mental state in question take the values "religious belief" and "sense of humor".  For a sobering real-world application, consider the fate of homosexuals until a few decades ago. And then think about how, as Eliezer has said, the future like the past will be filled with people whose values we would find abhorrent.

continue reading »