ata comments on The Importance of Self-Doubt - Less Wrong

Post author: multifoliaterose 19 August 2010 10:47PM


Comments (726)


Comment author: ata 20 August 2010 06:23:07PM 0 points

I must know, have you actually encountered people who literally think that? I'm really hoping that's a comical exaggeration, but I guess I should not overestimate human brains.

Comment author: Eliezer_Yudkowsky 20 August 2010 06:43:32PM 5 points

I've encountered people who think Singularitarians think that, never any actual Singularitarians who think that.

Comment author: ata 20 August 2010 07:14:44PM 8 points

Yeah, "people who think Singularitarians think that" is what I meant.

I've actually met exactly one something-like-a-Singularitarian who did think something-like-that — it was at one of the Bay Area meetups, so you may or may not have talked to him, but anyway, he was saying that only people who invent or otherwise contribute to the development of Singularity technology would "deserve" to actually benefit from a positive Singularity. He wasn't exactly saying he believed that the nonbelievers would be left to languish when cometh the Singularity, but he seemed to be saying that they should.

Also, I think he tried to convert me to Objectivism.

Comment author: timtyler 20 August 2010 08:13:06PM -1 points

Technological progress has increased wealth inequality a great deal so far.

Machine intelligence probably has the potential to result in enormous wealth inequality.

Comment author: WrongBot 20 August 2010 09:19:49PM 1 point

How, in a post-AGI world, would you define wealth? Computational resources? Matter?

I don't think there's any foundation for speculation on this topic at this time.

Comment author: khafra 20 August 2010 09:34:50PM 2 points

Unless we get a hard-takeoff singleton, which is admittedly the SIAI expectation, there will be massive inequality, with a few very wealthy beings and average income barely above subsistence. Thus saith Robin Hanson, and I've never seen any significant holes poked in that thesis.

Comment author: WrongBot 20 August 2010 09:45:48PM 0 points

Robin Hanson seems to be assuming that human preferences will, in general, remain in their current ranges. This strikes me as unlikely in the face of technological self-modification.

Comment author: khafra 20 August 2010 11:20:07PM 2 points

I've never gotten that impression. What I've gotten is that evolutionary pressures will, in the long term, still exist--even if technological self-modification leads to a population that's 99.99% satisfied to live within strict resource consumption limits, the 0.01% with a drive for replication or expansion will, unless harshly punished, overwhelm the rest within a few millennia, until the average income is back to subsistence. This doesn't depend on human preferences, just the laws of physics and natural selection.
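The dominance argument above can be sketched numerically. This is a toy illustration, not part of the thread; the 0.5%-per-year growth rate and population of one million are hypothetical numbers chosen only to show the compounding effect.

```python
# Toy model (hypothetical numbers): 99.99% of agents hold to a fixed
# resource limit, while a 0.01% minority keeps expanding at a modest
# compound rate. Any positive growth rate eventually dominates.

def expansionist_share(static0: float, growing0: float,
                       growth_rate: float, years: int) -> float:
    """Fraction of total resources held by the growing minority after `years`."""
    growing = growing0 * (1 + growth_rate) ** years
    return growing / (growing + static0)

# Population of 1,000,000; 100 agents (0.01%) expand at 0.5% per year.
start = expansionist_share(999_900, 100, 0.005, 0)          # 0.0001
after_3000y = expansionist_share(999_900, 100, 0.005, 3000)
# Within a few millennia the expansionist minority holds over 99% of
# the total, so the population-wide average falls back toward subsistence.
```

Nothing in the sketch depends on the replication mechanism, only on one subpopulation compounding while the other stays flat.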

Comment author: WrongBot 21 August 2010 12:41:55AM 1 point

What evolutionary pressures? Even making the incredible assumption that we will continue to use sequences of genes as a large part of our identities, what's to stop a singleton of some variety from eliminating drives for replication or expansion entirely?

I feel uncomfortable speculating about a post-machine-intelligence future even to this extent; this is not a realm in which I am confident about any proposition. Consequently, I view all confident conclusions with great skepticism.

Comment author: khafra 21 August 2010 01:21:10AM 4 points

You're still not getting the breadth and generality of Hanson's model. To use recent LW terminology, it's an anti-prediction.

It doesn't matter whether agents perpetuate their strategies by DNA mixing, binary fission, cellular automata, or cave paintings. Even if all but a tiny minority of posthumans self-modify not to want growth or replication, the few that don't will soon dominate the light-cone. A singleton, like I'd mentioned, is one way to avert this. Universal extinction and harsh, immediate punishment of expansion-oriented agents are the only others I see.

Comment author: wedrifid 21 August 2010 01:47:05AM 0 points

"What evolutionary pressures? Even making the incredible assumption that we will continue to use sequences of genes as a large part of our identities, what's to stop a singleton of some variety from eliminating drives for replication or expansion entirely?"

Your point is spot on. Competition cannot be relied on to produce adaptation if someone wins the competition once and for all.

Comment author: Vladimir_Nesov 20 August 2010 09:35:47PM 1 point

Control, owned by preferences.

Comment author: timtyler 20 August 2010 09:28:01PM 0 points

I wasn't trying to make an especially long-term prediction:

"We saw the first millionaire in 1716, the first billionaire in 1916 - and can expect the first trillionaire within the next decade - probably before 2016."

Comment author: WrongBot 20 August 2010 09:41:32PM 5 points
  1. Inflation.

  2. The richest person on earth currently has a net worth of $53.5 billion.

  3. The greatest peak net worth in recorded history, adjusted for inflation, was Bill Gates' $101 billion, which was ten years ago. No one since then has come close. A 10-fold increase in <6 years strikes me as unlikely.

  4. In any case, your extrapolated curve points to 2116, not 2016.

I am increasingly convinced that your comments on this topic are made in less than good faith.
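Point 4 follows from simple arithmetic on the quoted milestones. A sketch added for illustration; 2116 is just the straight-line extrapolation on a log scale, not a prediction:

```python
# The quoted series gains a factor of 1000 every 200 years:
# first millionaire in 1716, first billionaire in 1916.
first_millionaire = 1716   # net worth ~1e6 dollars
first_billionaire = 1916   # net worth ~1e9 dollars

years_per_1000x = first_billionaire - first_millionaire   # 200 years
# Reaching 1e12 (a trillion) is one more factor of 1000, so the same
# trend puts the first trillionaire around 2116, not 2016.
first_trillionaire = first_billionaire + years_per_1000x
```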

Comment author: timtyler 20 August 2010 10:59:57PM 0 points

Yes, the last figure looks wrong to me too - hopefully I will revisit the issue.

Update 2011-05-30: yes: 2016 was a simple math mistake! I have updated the text I was quoting from to read "later this century".

Anyway, the huge modern wealth inequalities are well established - and projecting them into the future doesn't seem especially controversial. Today's winners in IT are hugely rich - and tomorrow's winners may well be even richer. So it would not be very surprising for people to think something like they will "be on the inside track when the singularity happens".

Comment author: WrongBot 21 August 2010 12:35:05AM 3 points

"Anyway, the huge modern wealth inequalities are well established - and projecting them into the future doesn't seem especially controversial."

Projecting anything into a future with non-human intelligences is controversial. You have made an incredibly large assumption without realizing it. Please update.

Comment author: timtyler 21 August 2010 06:24:12AM 0 points

If you actually want your questions answered: money is society's representation of utility, and I think there will probably be something like it in the future, no matter how far out you go. What you may not find further out is "people". However, I wasn't really talking about any of that; I just meant while there are still money and people with bank accounts around.

Comment author: timtyler 20 August 2010 07:12:35PM 0 points

What about the recent "forbidden topic"? Surely that is a prime example of this kind of thing.

Comment author: timtyler 20 August 2010 07:07:51PM 5 points

"It's basically a modern version of a religious belief system and there's no purpose to it, like why, why must we have another one of these things ... you get an afterlife out of it because you'll be on the inside track when the singularity happens - it's got all the trappings of a religion, it's the same thing." - Jaron here.