FormallyknownasRoko comments on Best career models for doing research? - Less Wrong

Post author: Kaj_Sotala, 07 December 2010 04:25PM (27 points)




Comment author: FormallyknownasRoko 07 December 2010 05:46:20PM 7 points [-]

Maybe. The disadvantage is lag time, of course. The discount rate for the Singularity is very high. Assume that there are 100 years to the Singularity, and that P(success) is linearly decreasing in lag time; then every second approximately 25 galaxies are lost, assuming that the fate of all 80 billion galaxies is decided then.

25 galaxies per second. Wow.

Comment author: PeerInfinity 12 December 2010 12:56:18AM *  5 points [-]

I'm surprised that no one has asked Roko where he got these numbers from.

Wikipedia says that there are about 80 billion galaxies in the "observable universe", so that part is pretty straightforward. Though there's still the question of why all of them are being counted, when most of them probably aren't reachable with slower-than-light travel.

But I still haven't found any explanation for the "25 galaxies per second". Is this the rate at which the galaxies burn out? Or the rate at which something else causes them to be unreachable? Is it the number of galaxies, multiplied by the distance to the edge of the observable universe, divided by the speed of light?

calculating...

Wikipedia says that the comoving distance from Earth to the edge of the observable universe is about 14 billion parsecs (46 billion light-years short scale, i.e. 4.6 × 10^10 light years) in any direction.

Google Calculator says 80 billion galaxies / 46 billion light years = 1.73 galaxies per light-year (i.e. per year of light travel), or 5.48 × 10^-8 galaxies per second

so no, that's not it.
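For what it's worth, that guess is easy to reproduce, and the mismatch is real: dividing a galaxy count by a distance gives galaxies per light-year of radius, which converts to a per-second rate some nine orders of magnitude below 25. A quick sketch using the figures quoted above:

```python
# Reproduce the light-crossing guess: 80 billion galaxies divided by the
# 46-billion-light-year comoving radius of the observable universe,
# then converted to a per-second rate of light travel.
GALAXIES = 80e9
RADIUS_LY = 46e9                        # light-years (a distance, not a time)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

per_ly = GALAXIES / RADIUS_LY           # ~1.74 galaxies per light-year
per_second = per_ly / SECONDS_PER_YEAR  # ~5.5e-8 galaxies per second
print(per_ly, per_second)
```

So, as above: that interpretation cannot be the source of the 25-per-second figure.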

If I'm going to allow my mind to be blown by this number, I would like to know where the number came from.

Comment author: Caspian 12 December 2010 02:54:00AM *  2 points [-]

I also took a while to understand what was meant, so here is my reading of it:

Assumptions: There will be a singularity in 100 years. If the proposed research is started now, it will be a successful singularity, e.g. friendly AI. If the proposed research isn't started by the time of the singularity, it will be an unsuccessful (negative) singularity, but still a singularity. The probability of the successful singularity linearly decreases with the time when the research starts, from 100 percent now to 0 percent in 100 years' time.

A 1 in 80 billion chance of saving 80 billion galaxies is equivalent to definitely saving 1 galaxy, and the linearly decreasing chance of a successful singularity affecting all of them is equivalent to a linearly decreasing number being affected. 25 galaxies per second is the rate of that decrease.

Comment author: FormallyknownasRoko 12 December 2010 12:58:20AM 2 points [-]

I meant if you divide the number of galaxies by the number of seconds to an event 100 years from now. Yes, not all of them are reachable; you probably need to discount by an order of magnitude for reachability at lightspeed.
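That division is simple to check; a minimal sketch, assuming (per the thread) 80 billion galaxies and a 100-year horizon:

```python
# "25 galaxies per second": total galaxies divided by the number of
# seconds in the 100 years assumed to remain before the singularity.
GALAXIES = 80e9
SECONDS_PER_YEAR = 365.25 * 24 * 3600    # ~3.156e7
HORIZON_SECONDS = 100 * SECONDS_PER_YEAR

rate = GALAXIES / HORIZON_SECONDS
print(rate)  # ~25.35 galaxies per second of delay, in expectation
```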

Comment author: FAWS 12 December 2010 02:00:39AM 0 points [-]

Hmm, by the second Wikipedia link there is no basis for the 80 billion figure: only a relatively small fraction of the observable universe (4.2%?) is reachable if we are limited by the speed of light, and if we are not, the whole universe is probably at least 10^23 times larger (by volume or by radius?).

Comment author: shokwave 07 December 2010 06:01:32PM 3 points [-]

Guh. Every now and then something reminds me of how important the Singularity is. Time to reliable life extension is measured in lives per minute, time to Singularity is measured in galaxies per second.

Comment author: FormallyknownasRoko 07 December 2010 06:04:43PM *  1 point [-]

Well, conservatively assuming that each galaxy supports lives at 10^9 per sun per century (a tenth of our solar system's rate), that's already 10^29 lives per second right there.

And assuming utilization of all the output of the sun for living, i.e. some kind of giant spherical shell of habitable land, we can add another 12 orders of magnitude straight away. Then if we upload people that's probably another 10 orders of magnitude.

Probably up to 10^50 lives per second, without assuming any new physics could be discovered (a dubious assumption). If instead we assume that quantum gravity gives us as much of an increase in power as going from newtonian physics to quantum mechanics did, we can pretty much slap another 20 orders of magnitude onto it, with some small probability of the answer being "infinity".
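The first of those figures can be roughly reconstructed, though it needs inputs the comment doesn't state: the stars-per-galaxy count and the length of the habitable future below are my assumptions, chosen as typical round numbers, not Roko's. With them, 25 galaxies lost per second times the lives each galaxy could host over its future comes out near 10^29:

```python
# Order-of-magnitude reconstruction of "10^29 lives per second".
# STARS_PER_GALAXY and CENTURIES_OF_FUTURE are assumed values,
# not taken from the comment.
GALAXIES_LOST_PER_SECOND = 25
STARS_PER_GALAXY = 1e11           # assumed: typical large-galaxy figure
LIVES_PER_SUN_PER_CENTURY = 1e9   # from the comment
CENTURIES_OF_FUTURE = 1e8         # assumed: ~10 billion years

lives_per_galaxy = (STARS_PER_GALAXY * LIVES_PER_SUN_PER_CENTURY
                    * CENTURIES_OF_FUTURE)
lives_per_second_of_delay = GALAXIES_LOST_PER_SECOND * lives_per_galaxy
print(lives_per_second_of_delay)  # ~2.5e29
```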

Comment author: MartinB 08 December 2010 10:22:04AM 1 point [-]

Now that's a way to eat up your brain.

Comment author: XFrequentist 07 December 2010 08:46:32PM 2 points [-]

In what I take to be a positive step towards viscerally conquering my scope neglect, I got a wave of chills reading this.

Comment author: timtyler 08 December 2010 10:03:24AM 1 point [-]

That seems like a rather exaggerated sense of importance. It may be a fun fantasy in which the fate of the entire universe hangs in the balance in the next century - but do bear in mind the disconnect between that and the real world.

Comment author: shokwave 08 December 2010 03:33:28PM 3 points [-]

the disconnect between that and the real world.

Out of curiosity: what evidence would convince you that the fate of the entire universe does hang in the balance?

Comment author: Manfred 08 December 2010 10:16:19PM 2 points [-]

No human-comparable aliens, for one.

Which seems awfully unlikely, the more we learn about solar systems.

Comment author: timtyler 08 December 2010 10:47:42PM -1 points [-]

"Convince me" - with some unspecified level of confidence? That is not a great question :-|

We lack knowledge of the existence (or non-existence) of aliens in other galaxies. Until we have such knowledge, our uncertainty on this matter will necessarily be high - and we should not be "convinced" of anything.

Comment author: shokwave 09 December 2010 04:49:23AM 1 point [-]

What evidence would convince you, with 95% confidence, that the fate of the universe hangs in the balance in this next century on Earth?

You may specify evidence such as "strong evidence that we are completely alone in the universe" even if you think it is unlikely we will get such evidence.

Comment author: timtyler 09 December 2010 07:20:31AM -1 points [-]

I did get the gist of your question the first time - and answered accordingly. The question takes us far into counterfactual territory, though.

Comment author: shokwave 09 December 2010 07:49:11AM 1 point [-]

I was just curious to see if you rejected the fantasy on principle, or if you had other reasons.

Comment author: [deleted] 08 December 2010 02:02:04PM 1 point [-]

assuming that the entire 80 billion galaxies' fate is decided then.

What's your P of "the fate of all 80 billion galaxies will be decided on Earth in the next 100 years"?

Comment author: FormallyknownasRoko 08 December 2010 02:15:18PM 0 points [-]

There are some complexities regarding "decided", since physics is deterministic, but hand-waving that aside, I'd say 50%.

Comment author: [deleted] 09 December 2010 12:50:42AM 1 point [-]

With high probability, many of those galaxies are already populated. Is that irrelevant?

Comment author: FormallyknownasRoko 09 December 2010 12:24:19PM 0 points [-]

I disagree. I claim that the probability of >50% of the universe being already populated (using the space of simultaneity defined by a frame of reference comoving with earth) is maybe 10%.

Comment author: [deleted] 09 December 2010 01:33:40PM 0 points [-]

"Already populated" is a red herring. What's the probability that >50% of the universe will ever be populated? I don't see any reason for it to be sensitive to how well things go on Earth in the next 100 years.

Comment author: FormallyknownasRoko 09 December 2010 06:32:33PM 1 point [-]

I think it is likely that we are the only spontaneously-created intelligent species in the entire 4-manifold that is the universe, space and time included (excluding species which we might create in the future, of course).

Comment author: [deleted] 09 December 2010 06:58:44PM 1 point [-]

I'm curious to know how likely, and why. But do you agree that aliens are relevant to evaluating astronomical waste?

Comment author: timtyler 09 December 2010 06:37:12PM 0 points [-]

That seems contrary to the http://en.wikipedia.org/wiki/Self-Indication_Assumption

Do you have a critique - or a supporting argument?

Comment author: FormallyknownasRoko 09 December 2010 06:38:43PM *  3 points [-]

Yes, I have a critique. Most of anthropics is gibberish. Until someone makes anthropics work, I refuse to update on any of it. (Apart from the bits that are commonsensical enough to derive without knowing about "anthropics", e.g. that if your fishing net has holes 2 inches big, you shouldn't expect to catch fish smaller than 2 inches wide.)

Comment author: timtyler 09 December 2010 07:59:08PM 2 points [-]

I don't think you can really avoid anthropic ideas - or the universe stops making sense. Some anthropic ideas can be challenging - but I think we have got to try.

Anyway, you did the critique - but didn't go for a supporting argument. I can't think of very much that you could say. We don't have very much idea yet about what's out there - and claims to know such things just seem over-confident.

Comment author: [deleted] 09 December 2010 06:55:14PM 0 points [-]

There are strained applications of anthropics, like the doomsday argument. "What happened here might happen elsewhere" is much more innocuous.

Comment author: Vladimir_Nesov 09 December 2010 06:40:13PM 0 points [-]

I agree.

Comment author: [deleted] 09 December 2010 06:46:12PM *  2 points [-]

Even Nick Bostrom, who is arguably the leading expert on anthropic problems, rejects SIA for a number of reasons (see his book Anthropic Bias). That alone is a pretty big blow to its credibility.

Comment author: timtyler 09 December 2010 07:49:53PM *  0 points [-]

That is curious. Anyway, the self-indication assumption seems fairly straightforward (as much as any anthropic reasoning is, anyway). The critical material from Bostrom on the topic that I have read seems unpersuasive. He doesn't seem to "get" the motivation for the idea in the first place.

Comment author: Kevin 09 December 2010 02:28:22PM *  0 points [-]

If you think there is a significant probability that an intelligence explosion is possible or likely, then that question is sensitive to how well things go on Earth in the next 100 years.

Comment author: [deleted] 09 December 2010 03:06:06PM *  3 points [-]

However likely they are, I expect intelligence explosions to be evenly distributed through space and time. If 100 years from now Earth loses by a hair, there are still plenty of folks around the universe who will win or have won by a hair. They'll make whatever use of the 80 billion galaxies that they can--will they be wasting them?

If Earth wins by a hair, or by a lot, we'll be competing with those folks. This also significantly reduces the opportunity cost Roko was referring to.

Comment author: Vladimir_Nesov 08 December 2010 02:31:26PM *  0 points [-]

About 10% (if we ignore existential risk, which is a way of resolving the ambiguity of "will be decided"). Multiply that by opportunity cost of 80 billion galaxies.

Comment author: David_Gerard 08 December 2010 02:58:15PM 1 point [-]

Could you please detail your working to get to this 10% number? I'm interested in how one would derive it, in detail.

Comment author: Vladimir_Nesov 08 December 2010 03:20:26PM *  0 points [-]

I read the question as asking about the probability that we'll be finishing an FAI project in the next 100 years. Dying of an engineered virus doesn't seem like an example of "deciding the fate of 80 billion galaxies", although it does determine that fate.

FAI looks really hard. Improvements in mathematical understanding sufficient to bridge comparable gaps have taken many decades. I don't expect a reasonable attempt at actually building an FAI anytime soon (crazy potentially world-destroying AGI projects go in the same category as engineered viruses). One possible shortcut is ems, which effectively compress the required time, but I estimate that they probably won't be here for at least 80 more years, and then they'll still need time to become strong enough and break the problem. (By that time, biological intelligence amplification could take over as a deciding factor, using clarity of thought instead of lots of time to think.)

Comment author: [deleted] 09 December 2010 01:00:46AM 0 points [-]

My question has only a little bit to do with the probability that an AI project is successful. It has mostly to do with P(universe goes to waste | AI projects are unsuccessful). For instance, couldn't the universe go on generating human utility after humans go extinct?

Comment author: ata 09 December 2010 01:05:09AM 1 point [-]

For instance, couldn't the universe go on generating human utility after humans go extinct?

How? By coincidence?

(I'm assuming you also mean no posthumans, if humans go extinct and AI is unsuccessful.)

Comment author: [deleted] 09 December 2010 01:23:45AM 2 points [-]

Aliens. I would be pleased to learn that something amazing was happening (or was going to happen, long "after" I was dead) in one of those galaxies. Since it's quite likely that something amazing is happening in one of those 80 billion galaxies, shouldn't I be pleased even without learning about it?

Of course, I would be correspondingly distressed to learn that something horrible was happening in one of those galaxies.