Comment author: Houshalter 29 February 2016 01:26:58AM 2 points

That's very unlikely. The universe is billions of years old, yet it would take mere thousands of years to colonize the galaxy. Maybe millions if they aren't optimally efficient, but still a short time in the history of Earth.

Comment author: AABoyles 01 March 2016 04:09:22PM 0 points

I'm aware. Note that I did call it the "least likely possibility."

Comment author: James_Miller 26 February 2016 04:13:28PM 8 points

It has a belief that capturing the resources of the galaxy would not increase its security or further its objectives. It doesn't mind interfering with us, which it is implicitly doing by hiding its presence and giving us the Fermi paradox, which in turn influences our beliefs about the Great Filter.

Comment author: AABoyles 26 February 2016 06:43:43PM 3 points

For example, maybe they figured out how to convince it to accept some threshold of certainty (so it doesn't eat the universe to prove that it produced exactly 1,000,000 paperclips), it achieved its terminal goal with a tiny amount of energy (less than one star's worth), and halted.

Comment author: James_Miller 26 February 2016 06:18:49PM 3 points

If (1) is true the aliens should fear any type of life developing on other planets because that life would greatly increase the complexity of the galaxy. My guess is that life on earth has, for a very long time, done things to our atmosphere that would allow an advanced civilization to be aware that our planet harbors life.

Comment author: AABoyles 26 February 2016 06:42:11PM 5 points

This is actually a fairly healthy field of study. See, for example, Nonphotosynthetic Pigments as Potential Biosignatures.

Comment author: AABoyles 26 February 2016 06:23:04PM 2 points

The superintelligence could have been written to value-load based on its calculations about an alien (to its creators) superintelligence (what Bostrom refers to as the "Hail Mary" approach). This could cause it to value the natural development of alien biology enough to actively hide its activities from us.

Comment author: AABoyles 26 February 2016 06:28:15PM 0 points

...Think of the Federation's "Prime Directive" in Star Trek.

Comment author: AABoyles 26 February 2016 06:27:03PM 6 points

It may have discovered some property of physics which enabled it to expand more efficiently across alternate universes, rather than across space in any given universe. Thus it would be unlikely to colonize much of any universe (specifically, ours).

Comment author: AABoyles 26 February 2016 06:08:01PM 1 point

The most obvious and least likely possibility is that the superintelligence hasn't had enough time to colonize the galaxy (i.e. it was created very recently).

Comment author: AABoyles 22 February 2016 02:13:10PM 2 points

The link to the "Strategic Terrorism" paper is malformed. The correct URL is here.

Comment author: AABoyles 25 November 2015 09:12:07PM 4 points

To take the obvious approach, let's calculate Expected Values for both strategies. To start, let's try two-boxing:

(80/8000 * $1,000) + (7920/8000 * $1,001,000) = $991,000

Not bad. OK, how about one-boxing?

(3996/4000 * $1,000,000) + (4/4000 * $0) = $999,000

So one-boxing is the rational strategy (assuming you're seeking to maximize the amount of money you get).
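The two calculations above can be sketched in a few lines of Python. The prediction frequencies (80/8000 and 4/4000) are taken directly from the arithmetic in this comment; the `expected_value` helper is just for illustration.

```python
def expected_value(outcomes):
    """Sum probability-weighted payoffs. outcomes: list of (probability, payoff)."""
    return sum(p * payoff for p, payoff in outcomes)

# Two-boxing: $1,000 with frequency 80/8000, $1,001,000 otherwise,
# per the figures quoted above.
two_box = expected_value([(80/8000, 1_000), (7920/8000, 1_001_000)])

# One-boxing: $1,000,000 with frequency 3996/4000, $0 otherwise.
one_box = expected_value([(3996/4000, 1_000_000), (4/4000, 0)])

print(two_box)  # 991000.0
print(one_box)  # 999000.0
```

As the output shows, one-boxing comes out ahead by $8,000 in expectation.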

However, this game has two interesting properties which, together, would make me consider two-boxing based on exogenous circumstances. The first is that the difference between the two strategies is very small: only $8,000. If I have $990-odd thousand, I'm not going to be hung up on the last $8,000. In other words, money has a diminishing marginal utility. The second is that two-boxing guarantees the player receives at least $1,000, whereas one-boxing could result in the player receiving nothing. Again, because money has a diminishing marginal utility, getting the first $1,000 may be worth the risk of not winning the million. If, for example, I needed a sum of money less than $1,000 to keep myself alive (with certainty), I would two-box in a heartbeat.

All that said, I would (almost always, certainly) one-box.

Comment author: AABoyles 21 October 2015 07:20:58PM 8 points

Nobody wants to hear that you will try your best. It is the wrong thing to say. It is like saying "I probably won't hit you with a shovel." Suddenly everyone is afraid you will do the opposite.

--Lemony Snicket, All the Wrong Questions
