linkhyrule5 comments on Open thread, July 29-August 4, 2013 - Less Wrong

Post author: David_Gerard 29 July 2013 10:26PM


Comment author: linkhyrule5 02 August 2013 08:04:07AM 5 points

Waffled between putting this here and putting this in the Stupid Questions thread:

Why is the default assumption that a superintelligence of any type will populate its light cone?

I can see why any sort of tiling AI would do this - paperclip maximizers and the like. And for obvious reasons there's an inherent problem with predicting the actions of an alien FAI (friendly relative to alien values).

But it certainly seems to me that a human CEV-equivalent wouldn't necessarily support lightspeed expansion. Humanity has certainly expanded whenever it has had the opportunity - but not at its maximum speed, nor did entire population centers move. The top few percent of adventurous or less affluent people leave, and that is all.

On top of this, I ... well, I can't say "can't imagine," but I find it unlikely that a CEV would support mass cloning or generation of humans (though if it supports mass uploading, then accelerated living might produce a population boom sufficient to support luminal expansion). In which case, an FAI that did occupy as much space as possible, as rapidly as possible, would find itself spending resources on planets that wouldn't be used for millennia, when it could instead focus on improving local life.

There is, of course, the intelligence-explosion argument, but I'd think even intelligence would hit diminishing marginal returns eventually.

So to sum up, it seems not unreasonable that certain plausible categories of superintelligences would willingly not expand at near-luminal velocities - in which case there's quite a bit more leeway in the Fermi Paradox.

Comment author: Oscar_Cunningham 02 August 2013 11:22:48AM 4 points

It's because we want to secure as many resources as possible, before the aliens get to them.

I expect an FAI to expand rapidly, but merely to secure resources and save them for humans to use much later.

Comment author: Lumifer 02 August 2013 08:04:12PM 1 point

I expect an FAI to expand rapidly, but merely to secure resources and save them for humans to use much later.

So maybe the Solar System has been secured by an alien-FAI and we're being saved for the aliens to use much later..?

Comment author: Oscar_Cunningham 02 August 2013 08:36:52PM 1 point

It's totally possible, but they'd have to have a good reason for staying hidden, given the point nyan_sandwich makes.

Comment author: [deleted] 02 August 2013 08:16:15PM 1 point

The most valuable of those resources is free energy. The Sun is burning it into low-grade light and heat at an incredible rate.
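
A back-of-the-envelope sketch of just how fast that is (my own illustration, not from the thread; the constants are standard measured values):

```python
# Rate at which the Sun converts mass-energy into low-grade light and heat.
L_SUN = 3.828e26   # solar luminosity, watts
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

# E = mc^2, so the mass-energy radiated away per second is L / c^2.
mass_burn_rate = L_SUN / C**2   # ~4.26e9 kg/s, i.e. ~4 million tonnes/s
print(f"mass-energy radiated: {mass_burn_rate:.2e} kg/s")

# As a fraction of the Sun's total mass per billion years:
SECONDS_PER_YEAR = 3.156e7
fraction_per_gyr = mass_burn_rate * SECONDS_PER_YEAR * 1e9 / M_SUN  # ~7e-5
print(f"fraction of solar mass radiated per Gyr: {fraction_per_gyr:.2e}")
```

The fraction is small, but it is all energy that, once radiated into space, is lost to any future user.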

Comment author: Lumifer 02 August 2013 08:41:20PM 2 points

So does that imply that a rapidly expanding resource-saving FAI would go around extinguishing stars?

Comment author: [deleted] 02 August 2013 10:10:58PM 8 points

Seems prudent to do.

Unless it values the existence of stars more than it values other things it could do with that energy.

Comment author: Nisan 04 August 2013 04:03:26PM 4 points

Upvoted for being the first instance I've seen of someone describing extinguishing all the stars in the night sky as prudent.

Comment author: DanielLC 03 August 2013 03:23:38AM 1 point

I suspect using them is more likely. They certainly aren't going to just let them keep wasting fuel, unless they have the opportunity to prevent even more waste elsewhere; for example, they'd send out probes to other systems before worrying too much about this one.

Comment author: Oscar_Cunningham 03 August 2013 12:16:39AM 1 point

extinguishing stars

Is that even possible!? The FAI would want to somehow pause the burning of the star, allowing it to begin producing energy again when needed. For example, collapsing it into a black hole wouldn't be what we want, since the energy would be wasted.

Would star lifting be enough to slow the burning of a star to a standstill?

Comment author: wadavis 02 August 2013 02:59:09PM 1 point

Read up on the Dominion Lands Act and the Homestead Act for a historic human precedent.

Comment author: linkhyrule5 02 August 2013 07:32:32PM 1 point

Right, but I'm not sure that's the right precedent to use. Space is big: it'd be more equivalent to, oh, dumping the Lost Roman Legion in prehistoric Asia and expecting them to divvy up the continent as fast as they could march.

Comment author: wadavis 02 August 2013 08:19:43PM 3 points

Davy Jones: One Soul is not equal to another

Jack Sparrow: Aha! So we've established my proposal is sound in principle, now we're just haggling over price.

-- Pirates of the Caribbean: Dead Man's Chest

Or in this case, scope instead of price.

Jokes aside, the point is that the sponsored settlement of the prairies influenced the negotiation of the Canada/U.S.A. border. If a human civilization believed it might face future competition with aliens for territory in space, it would make sense for it to secure as much as possible as a Schelling point in negotiations/conflicts.

Comment author: linkhyrule5 03 August 2013 12:32:34AM 1 point

Point granted.

... and once an FAI has sent out probes to claim territory anyway, it loses nothing by making those probes nanotech with a copy of the FAI loaded on them, so we would indeed expect to see lightspeed expansions of FAI-controlled civilizations. Fair enough, then.

Comment author: linkhyrule5 02 August 2013 07:35:35PM 0 points

Hm. Point.

Comment author: DanielLC 03 August 2013 03:26:45AM 3 points

Due to the way the universe expands, even if you travel at the speed of light forever, you can only reach a finite portion of it, and the longer you wait, the smaller that portion is. Because of this, an AI that doesn't send out probes as fast as possible and, to a lesser extent, as soon as possible, will only be able to control a smaller portion of the universe. If you have any preferences about what happens in the rest of the universe, you'd want to leave early.
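
This shrinkage can be made quantitative with a toy model (my own illustration; a pure dark-energy-dominated de Sitter universe and a ~14.4 Gyr Hubble time are my assumptions, not an exact model of our universe):

```python
import math

# Toy de Sitter model: scale factor a(t) = exp(H * t), so the comoving
# distance reachable by a light-speed probe launched after a delay t is
#     chi(t) = integral from t to infinity of c / a(t') dt' = (c / H) * exp(-H * t)
# Units: time in Gyr, distance in Gly, so c = 1 Gly/Gyr.
H = 1 / 14.4   # Hubble rate in 1/Gyr (assumed ~14.4 Gyr Hubble time)
C = 1.0        # speed of light in Gly/Gyr

def reachable_comoving_distance(delay_gyr):
    """Comoving distance (Gly) ever reachable by a probe that waits
    delay_gyr before launching, in the toy de Sitter universe."""
    return (C / H) * math.exp(-H * delay_gyr)

for delay in (0, 1, 10):
    print(f"launch delay {delay:>2} Gyr -> "
          f"reach {reachable_comoving_distance(delay):.2f} Gly")
```

The reachable distance decays exponentially with the launch delay: every Hubble time of waiting cuts it by a factor of e, which is the quantitative core of the "leave early" argument.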

Also, as Oscar said, you don't want the resources you can easily reach to go to waste while you're putting off using them.