Alicorn comments on Open Thread: March 2010, part 3 - Less Wrong

3 Post author: RobinZ 19 March 2010 03:14AM


Comment author: Alicorn 31 March 2010 12:38:19AM 3 points [-]

If you had to tile the universe with something - something simple - what would you tile it with?

Comment author: Clippy 31 March 2010 01:29:23AM 4 points [-]

Paperclips.

Comment author: RobinZ 31 March 2010 12:51:09AM 4 points [-]

I have no interest in tiling the universe with anything - that would be dull. Therefore I would strive to subvert the spirit of such a restriction as effectively as I could. Off the top of my head, pre-supernova stars seem like adequate tools for the purpose.

Comment author: Mitchell_Porter 31 March 2010 02:15:56AM 2 points [-]

Are you sure that indiscriminately creating life in this fashion is a good thing?

Comment author: RobinZ 31 March 2010 02:33:57AM 1 point [-]

No, but given the restrictions of the hypothetical it's on my list of possible courses of action. Were there any possibility of my being forced to make the choice, I would definitely want more options than just this one to choose from.

Comment author: jimrandomh 31 March 2010 03:35:28AM *  2 points [-]

If you had to tile the universe with something - something simple - what would you tile it with?

Copies of my genome. If I can't do anything to affect the utility function I really care about, then I might as well optimize the one evolution tried to make me care about instead.

(Note that I interpret 'simple' as excluding copies of my mind, simulations of interesting universes, and messages intended for other universes that simulate this one to read, any of which would be preferable to anything simple.)

Comment author: Mitchell_Porter 31 March 2010 12:44:03AM 2 points [-]

Can the tiles have states that change and interact?

Comment author: Alicorn 31 March 2010 12:47:19AM 0 points [-]

Only if that doesn't violate the "simple" condition.

Comment author: ata 31 March 2010 01:37:17AM *  0 points [-]

What counts as simple?

If something capable as serving as a cell in a cellular automaton would count as simple enough, I'd choose that. And I'd design it to very occasionally malfunction and change states at random, so that interesting patterns could spontaneously form in the absence of any specific design.
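The rule ata describes can be sketched as a Life-like cellular automaton where each cell occasionally malfunctions and flips state at random. This is only an illustration under assumed details: ata never specified the update rule, and the Game of Life rule and the `p_malfunction` parameter here are hypothetical choices.

```python
import random

def life_step(grid, p_malfunction=0.0, rng=random):
    """One step of Conway's Game of Life on a toroidal grid of 0/1 cells,
    where each cell independently malfunctions (flips its new state)
    with probability p_malfunction."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbours, wrapping around the edges.
            n = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            alive = grid[r][c]
            nxt[r][c] = 1 if (n == 3 or (alive and n == 2)) else 0
            # The rare random malfunction that lets patterns arise unplanned.
            if rng.random() < p_malfunction:
                nxt[r][c] ^= 1
    return nxt
```

With `p_malfunction=0` this is the ordinary deterministic rule; any nonzero value injects the noise that could, given enough time and space, seed structure in an otherwise uniform tiling.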

Comment author: Alicorn 31 March 2010 01:41:07AM *  1 point [-]

Basically, the "simple" condition was designed to elicit answers more along the lines of "paperclips!" or "cheesecake!", rather than "how can I game the system so that I can have interesting stuff in the universe again after the tiling happens?" You're not playing fair if you do that.

I find this an interesting question because while it does seem to be a consensus that we don't want the universe tiled with orgasmium, it also seems intuitively obvious that this would be less bad than tiling the universe with agonium or whatever you'd call it; and I want to know what floats to the top of this stack of badness.

Comment author: Clippy 31 March 2010 02:01:05AM 2 points [-]

Basically, the "simple" condition was designed to elicit answers more along the lines of "paperclips!"

Mission accomplished! c=@

Now, since there seems to be a broad consensus among the posters that paperclips would be the optimal thing to tile the universe with, how about we get to work on it?

Comment author: wedrifid 31 March 2010 04:56:58AM 1 point [-]

Basically, the "simple" condition was designed to elicit answers more along the lines of "paperclips!" or "cheesecake!", rather than "how can I game the system so that I can have interesting stuff in the universe again after the tiling happens?" You're not playing fair if you do that.

And that is a good thing. Long live the munchkins of the universe!

Comment author: RobinZ 31 March 2010 01:55:03AM 0 points [-]

I think orgasmium is significantly more complex than cheesecake. Possibly complex enough that I could make an interesting universe if I were permitted that much complexity, but I don't know enough about consciousness to say.

Comment author: Peter_de_Blanc 31 March 2010 04:55:11AM 2 points [-]

Cheesecake is made of eukaryotic life, so it's pretty darn complex.

Comment author: wedrifid 31 March 2010 05:07:35AM *  5 points [-]

Hmm... a universe full of cheesecake will have enough hydrogen around to form stars once the cheesecakes attract each other, with further cheesecake forming into planets that are a perfect breeding ground for life, already seeded with DNA and RNA!

Comment author: RobinZ 31 March 2010 11:00:02AM *  0 points [-]

Didn't think of that. Okay, orgasmium is significantly more complex than paperclips.

Comment author: wnoise 31 March 2010 05:29:39AM 0 points [-]

What? It's products of eukaryotic life. Usually the eukaryotes are dead. Though plenty of microorganisms immediately start colonizing.

Unless you mean the other kind of cheesecake.

Comment author: Peter_de_Blanc 31 March 2010 07:42:41PM 0 points [-]

I suppose that the majority of the cheesecake does not consist of eukaryotic cells, but there are definitely plenty of them in there. I've never looked at milk under a microscope but I would expect it to contain cells from the cow. The lemon zest contains lemon cells. The graham cracker crust contains wheat. Dead cells would not be much simpler than living cells.

Comment author: JGWeissman 31 March 2010 01:51:04AM 1 point [-]

I have no preferences within the class of states of the universe that do not, and cannot evolve to, contain consciousness.

But if, for example, I were put in this situation by a cheesecake maximizer, I would choose something other than cheesecake.

Comment author: Alicorn 31 March 2010 04:17:40AM 1 point [-]

Interesting. Just to be contrary?

Comment author: JGWeissman 31 March 2010 04:56:54AM 4 points [-]

Because, as near as I can calculate, UDT advises me to. Like what Wedrifid said.

And like Eliezer said here:

Or the Countess just decides not to pay, unconditional on anything the Baron does. Also, if the Baron ends up in an infinite loop or failing to resolve the way the Baron wants to, that is not really the Countess's problem.

And here:

As I always press the "Reset" button in situations like this, I will never find myself in such a situation.

EDIT: Just to be clear, the idea is not that I quickly shut off the AI before it can torture simulated Eliezers; it could have already done so in the past, as Wei Dai points out below. Rather, because in this situation I immediately perform an action detrimental to the AI (switching it off), any AI that knows me well enough to simulate me knows that there's no point in making or carrying out such a threat.

I am assuming that an agent powerful enough to put me in this situation can predict that I would behave this way.

Comment author: wedrifid 31 March 2010 04:32:13AM 2 points [-]

It also potentially serves decision-theoretic purposes, much like a Duchess choosing not to pay off her blackmailer. If it is assumed that a cheesecake maximiser has a reason to force you into such a position (rather than doing it himself), then it is not unreasonable to expect that the universe may be better off if Cheesy had to take his second option.

Comment author: byrnema 01 April 2010 12:47:25PM *  0 points [-]

I can't recall: do your views on consciousness have a dualist component? If consciousness is in some way transcendental (that is, as a whole somehow independent or outside of the material parts), then I understand valuing it as, for example, something that has interesting or unique potential.

If you are not dualistic about consciousness, could you describe why you value it more than cheesecake?

Comment author: JGWeissman 02 April 2010 01:17:42AM 0 points [-]

No, I am not a dualist.

If you are not dualistic about consciousness, could you describe why you value it more than cheesecake?

To be precise, I value positive conscious experience more than cheesecake, and negative conscious experience less than cheesecake.

I assign value to things according to how they are experienced, and consciousness is required for this experience. This has to do with the abstract properties of conscious experience, and not with how it is implemented, whether by mathematical structure of physical arrangements, or by ontologically basic consciousness.

Comment author: Matt_Simpson 31 March 2010 01:35:21AM *  1 point [-]

me

(I'm assuming I'll be broken down as part of the tiling process, so this preserves me)

Comment author: wedrifid 31 March 2010 04:53:01AM 2 points [-]

Damn. If only I was simple, I could preserve myself that way too! ;)

Comment author: Rain 05 April 2010 03:57:55PM 0 points [-]

Isn't the universe already tiled with something simple in the form of fundamental particles?

Comment author: JGWeissman 05 April 2010 04:37:02PM 1 point [-]

In a tiled universe, the universe is partitioned into a grid of tiles, and the same pattern is repeated exactly in every tile, so that if you know what one tile looks like, you know what the entire universe looks like.
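The definition above can be made concrete: a tiled universe is exactly one pattern repeated, so knowing the tile fixes the whole. A toy one-dimensional sketch (the function names are illustrative, not from the thread):

```python
def tile_universe(tile, repeats):
    """Build a 1-D 'universe' by repeating a single tile exactly."""
    return tile * repeats

def is_tiled_by(universe, tile):
    """True if the universe is precisely this tile repeated with no remainder."""
    if not tile or len(universe) % len(tile) != 0:
        return False
    return universe == tile * (len(universe) // len(tile))
```

On this definition the actual universe fails: fundamental particles are simple, but they are not arranged in one exactly repeating pattern.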

Comment author: Jack 31 March 2010 04:35:56AM *  0 points [-]

A sculpture of stars, nebulae and black holes whose beauty will never be admired by anyone.

ETA: If this has too little entropy to count as simple, well, whatever artwork I can get away with, I'll take.

Comment author: Kevin 31 March 2010 01:31:37AM 0 points [-]

Computronium

Comment author: wedrifid 31 March 2010 04:24:18AM 0 points [-]

Witty comics. (eg)