Eliezer_Yudkowsky comments on Rationality Quotes September 2012 - Less Wrong

7 Post author: Jayson_Virissimo 03 September 2012 05:18AM


Comment author: Eliezer_Yudkowsky 18 September 2012 01:53:02PM 2 points [-]

Depends how you interpret the proverb. If you told me the Earth would last a hundred years, it would increase the immediate priority of CFAR and decrease that of SIAI. It's a moot point since the Earth won't last a hundred years.

Comment author: [deleted] 18 September 2012 01:57:50PM 5 points [-]

Sorry, Earth won't last a hundred years?

Comment author: MugaSofer 18 September 2012 02:01:51PM 5 points [-]

Nanotech and/or UFAI.

Comment author: Mitchell_Porter 18 September 2012 11:04:28PM 3 points [-]

The idea seems to be that even if there is a friendly singularity, Earth will be turned into computronium or otherwise transformed.

Comment author: TheOtherDave 18 September 2012 03:21:48PM 4 points [-]

I am surprised that this claim surprises you. A big part of SI's claimed value proposition is the idea that humanity is on the cusp of developing technologies that will kill us all if not implemented in specific ways that non-SI folk don't take seriously enough.

Comment author: [deleted] 18 September 2012 11:01:14PM 1 point [-]

Of course you're right. I guess I haven't noticed the topic come up here for a while, and haven't seen the apocalypse predicted so straightforwardly (and quantitatively) before so am surprised in spite of myself.

Although, in context, it sounds like EY is saying that the apocalypse is so inevitable that there's no need to make plans for the alternative. Is that really the consensus at EY's institute?

Comment author: TheOtherDave 19 September 2012 01:15:13AM 1 point [-]

I have no idea what the consensus at SI is.

Comment author: [deleted] 18 September 2012 05:32:25PM 0 points [-]

I guess he means “only last a hundred years”, not “last at least a hundred years”.

Comment author: TheOtherDave 18 September 2012 05:42:46PM 0 points [-]

Just to make sure I understand: you interpret EY to be saying that the Earth will last more than a hundred years, not saying that the Earth will fail to last more than a hundred years. Yes?

If so, can you clarify how you arrive at that interpretation?

Comment author: [deleted] 18 September 2012 05:50:15PM *  0 points [-]

“If you told me the Earth would only last a hundred years (i.e. won't last longer than that)... It's a moot point since the Earth won't only last a hundred years (i.e. it will last longer).” At least that's what I got on the first reading.

I think I could kind-of make sense of “it would increase the immediate priority of CFAR and decrease that of SIAI” under either hypothesis about what he means, though one interpretation would need to be more strained than the other.

Comment author: ArisKatsaris 18 September 2012 05:58:35PM *  4 points [-]

The idea is that if Earth lasts at least a hundred years (if that's a given), then the possibility of a uFAI in that timespan severely decreases, so SIAI (which seeks to prevent a uFAI by building a FAI) is less of an immediate priority. It then becomes a higher priority to develop CFAR, which will raise the public's rationality for future generations, so that those generations don't launch a uFAI.

Comment author: [deleted] 18 September 2012 06:10:11PM *  0 points [-]

(The other interpretation would be “If the Earth is going to only last a hundred years, then there's not much point in trying to make a FAI since in the long term we're screwed anyway, and raising the sanity waterline will make us enjoy more what time there is left.”)

EDIT: Also, if your interpretation is correct, by saying that the Earth won't last 100 years he's either admitting defeat (i.e. saying that an uFAI will be built) or saying that even a FAI would destroy the Earth within 100 years (which sounds unlikely to me -- even if the CEV of humanity would eventually want to do that, I guess it would take more than 100 years to terraform another place for us to live and for us all to move there).

Comment author: Eliezer_Yudkowsky 19 September 2012 03:14:24PM 3 points [-]

I was just using "Earth" as a synonym for "the world as we know it".

Comment author: MixedNuts 19 September 2012 05:18:46PM 11 points [-]

I think I disagree; care to make it precise enough to bet on? I'm expecting life still around, Earth the main population center, most humans not uploaded, some people dying of disease or old age or in wars, most people performing dispreferred activities in exchange for scarce resources at least a couple months in their lives, most children coming out of a biological parent and not allowed to take major decisions for themselves for at least a decade.

I'm offering $100 at even odds right now and will probably want to bet again in the next few years. I can give it to you (if you're going to transfer it to SIAI/CFAR, tell me and I'll donate directly), and you pay me $200 if the world has not ended in 100 years, as soon as we're both available (e.g. thawed). If you die you can keep the money; if I die and then win, give it to some sensible charity.

How's that sound? All of the above is up for negotiation.

Comment author: wedrifid 20 September 2012 10:36:18AM *  4 points [-]

I'm offering $100 at even odds right now and will probably want to bet again in the next few years. I can give it to you (if you're going to transfer it to SIAI/CFAR, tell me and I'll donate directly), and you pay me $200 if the world has not ended in 100 years, as soon as we're both available (e.g. thawed). If you die you can keep the money; if I die and then win, give it to some sensible charity.

(Neglecting any logistic or legal issues) this sounds like a no-brainer for Eliezer (accept).

How's that sound?

Like you would be better served by making the amounts you give, and expect to receive if you win, somewhat more proportionate to the expected utility of the resources at the time. If Eliezer were sure he was going to lose, he should still take the low-interest loan.

Even once the above is accounted for Eliezer should still accept the bet (in principle).

Comment author: Mitchell_Porter 26 September 2012 11:32:34PM 3 points [-]

I'm expecting ...

That's a nice set of criteria by which to distinguish various futures (and futurists).

Comment author: Eliezer_Yudkowsky 26 September 2012 11:01:32PM 6 points [-]

As wedrifid says, this is a no-brainer "accept" (including the purchasing-power-adjusted caveat). If you are inside the US and itemize deductions, please donate to SIAI, otherwise I'll accept via Paypal. Your implied annual interest rate assuming a 100% probability of winning is 0.7% (plus inflation adjustment). Please let me know whether you decide to go through with it; withdrawal is completely understandable - I have no particular desire for money at the cost of forcing someone else to go through with a bet they feel uncomfortable about. (Or rather, my desire for $100 is not this strong - I would probably find $100,000 much more tempting.)
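The 0.7% figure can be checked directly: treating the bet as a $100 loan repaid as $200 a century later, the implied annual rate is 2^(1/100) − 1. A minimal sketch of that arithmetic:

```python
# Sanity-check of the implied interest rate on the bet:
# $100 paid now, $200 returned 100 years later if the bet is won.
principal = 100.0
payout = 200.0
years = 100

# Solve payout = principal * (1 + r)**years for the annual rate r.
annual_rate = (payout / principal) ** (1 / years) - 1
print(f"{annual_rate * 100:.2f}% per year")  # prints "0.70% per year"
```

This matches the quoted figure: doubling over a century compounds to just under 0.7% per year, before any inflation adjustment.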

Comment author: MugaSofer 27 September 2012 11:43:34AM 0 points [-]

I'm expecting [...] some people dying of disease or old age or in wars

Care to explain why? You sound like you expect nanotech by then.

Comment author: [deleted] 19 September 2012 05:20:43PM *  2 points [-]

(I guess I had been primed to take “Earth” to mean ‘a planet or dwarf planet (according to the current IAU definition) orbiting the Sun between Venus and Mars’ by this. EDIT: Dragon Ball too, where destroying a planet means turning it into dust, not just rendering it uninhabitable.)

Comment author: ciphergoth 20 September 2012 06:41:05AM 1 point [-]

I feel an REM song coming on...

Comment author: ArisKatsaris 18 September 2012 07:32:58PM 1 point [-]

Also, if your interpretation is correct, by saying that the Earth won't last 100 years he's either admitting defeat (i.e. saying that an uFAI will be built

EY does seem in a darker mood than usual lately, so it wouldn't surprise me to see him implying pessimism about our chances out loud, even if it doesn't go so far as "admitting defeat". I do hope it's just a mood, rather than that he has rationally updated his estimation of our chances of survival to be even lower than they already were. :-)

Comment author: Decius 26 September 2012 06:04:47PM 2 points [-]

"The world as we know it" ends if FAI is released into the wild.

Comment author: [deleted] 27 September 2012 03:51:00PM 2 points [-]

When I had commented, EY hadn't clarified yet that by Earth he meant “the world as we know it”, so I didn't expect “Earth” to exclude ‘the planet between Venus and Mars 50 years after a FAI is started on it’.

Comment author: Decius 26 September 2012 05:48:45PM 0 points [-]

So, we can construct an argument that CFAR would rise in relative importance over SIAI if we see strong evidence that the world as we know it will end within 100 years, and an argument with the same conclusion if we see strong evidence that the world as we know it will last for at least 100 years.

There is something wrong.