AdeleneDawner comments on Brief question about Conway's Game of Life and AI - Less Wrong

13 [deleted] 02 June 2011 02:51AM

Comment author: AdeleneDawner 05 June 2011 01:56:41PM 3 points [-]

We may stare at the empty plane and ask ourselves whether this is the graveyard of a superintelligence that once lived here, conquered the plane for a brief time, and then vanished in a collapse. A few gliders and roses could be all that remained, like dry fossils.

Or, we could find that the playing field stabilizes to something that can easily be interpreted as a superintelligence's preferred state - perhaps with the field divided into subsections in which interesting things happen in repeated cycles, or whatever.
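The "repeated cycles" end state imagined here is something one can check for mechanically. A minimal sketch (mine, not from the thread; the set-of-live-cells representation and function names are my own) that steps a Life configuration and detects when it has settled into a repeating cycle:

```python
from collections import Counter

def step(cells):
    """One Game of Life generation; `cells` is a set of live (x, y) coordinates."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is alive next step with exactly 3 neighbours, or 2 if already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in cells)}

def find_cycle(cells, max_steps=1000):
    """Iterate until a previously seen state recurs.

    Returns (first_step_of_cycle, period), or None if nothing repeats
    within max_steps.
    """
    seen = {frozenset(cells): 0}
    for t in range(1, max_steps + 1):
        cells = step(cells)
        key = frozenset(cells)
        if key in seen:
            return seen[key], t - seen[key]
        seen[key] = t
    return None

# A blinker settles into a period-2 cycle immediately:
print(find_cycle({(0, 0), (1, 0), (2, 0)}))  # -> (0, 2)
```

Note that on an unbounded plane a pattern that merely translates, like a lone glider, never repeats as an absolute state, so this only catches genuinely periodic debris.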

Comment author: orthonormal 05 June 2011 03:20:00PM 2 points [-]

Interesting question: why does this (intuitively and irrationally) seem to me like a sadder fate than something like heat death?

Comment author: AdeleneDawner 05 June 2011 03:28:39PM *  3 points [-]

Because it takes the meaning out of the accomplishment? In this scenario, there might be something interpretable as a superintelligence that exists at some point before the scenario settles into repeating, but the end state still seems more to be caused by the initial state than by the superintelligence.

Alternately, it could be because you value novelty, and the repeating nature of the stabilized field precludes that in a way that's more emotionally salient than heat death.

Comment author: orthonormal 05 June 2011 06:36:56PM 2 points [-]

the end state still seems more to be caused by the initial state than by the superintelligence

But this is true of the heat death of the universe, too, eventually...

Comment author: cousin_it 13 June 2011 12:48:33AM *  1 point [-]

The only long-term scenario I know that avoids both this outcome and heat death involves launching colonies into the "dust" as Greg Egan described. Unfortunately the assumptions required for that to work may well turn out to be false. There's no known law of nature saying we can't be trapped in the long term. And if we are trapped, I think I prefer a repeating paradise to heat death.

Comment author: JoshuaZ 06 June 2011 04:31:00PM *  0 points [-]

I don't know. Heat death seems a lot sadder to me, in part because I know that there's at least one universe where it will probably happen. Maybe you are just more used to the notion of heat death and so have digested that sour grape but not this one?

Comment author: red75 15 June 2011 01:19:09AM 0 points [-]

I wonder why a rational consequentialist agent should do anything but channel all available resources into the instrumental goal of finding a way to circumvent heat death. Mixed strategies are obviously suboptimal, since the expected utility of circumventing heat death is infinite.
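The dominance claim can be sketched formally (my notation, not from the thread): let $r \in [0, r_{\max}]$ be the resources devoted to circumvention, $p(r)$ the probability of success (assumed nondecreasing in $r$), $W$ the payoff of success, and $u(r)$ a bounded payoff otherwise.

```latex
\mathbb{E}[U(r)] = p(r)\,W + \bigl(1 - p(r)\bigr)\,u(r)
```

For $W$ large enough to swamp every bounded $u$, any allocation with $p(r) < p(r_{\max})$ is dominated by $r = r_{\max}$, which is the sense in which mixed strategies lose. One caveat: with a literally infinite $W$, any two strategies with $p > 0$ both have infinite expected utility, so the comparison needs a limit argument like the above rather than naive arithmetic on $\infty$.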