shminux comments on [SEQ RERUN] The Design Space of Minds in General - Less Wrong

Post author: MinibearRex 14 June 2012 04:58AM




Comment author: shminux 14 June 2012 05:40:05AM * -2 points

My primary moral is to resist the temptation to generalize over all of mind design space

If we focus on the bounded subspace of mind design space which contains all those minds whose makeup can be specified in a trillion bits or less, then every universal generalization that you make has two to the trillionth power chances to be falsified.

That's what I think every time someone brings up the idea of tortured sims. What are the odds of it happening?

Comment author: DanielLC 14 June 2012 06:18:37AM 3 points

I'd say somewhere in the vicinity of 100%.

I'm just hoping euphoric sims will be more common.

Comment author: stcredzero 14 June 2012 07:29:33PM 0 points

I'm just hoping euphoric sims will be more common.

What if all sims are tortured sims, with the difference being in the degree of torture?

Comment author: DanielLC 14 June 2012 07:50:29PM 0 points

That would suck. Why?

Comment author: stcredzero 14 June 2012 09:47:51PM * 0 points

Well, you could consider our reality to be a devious torture sim in which we are allowed moments of happiness, joy, and even euphoria, but those are interspersed with boredom and pain, and everything is doomed to eventual decrepitude and death.

I suspect that sims where things sometimes suck, and suck a little more than they're awesome on balance, predominate in the set of conceivable self-consistent universes.

Comment author: [deleted] 14 June 2012 05:57:03AM 1 point

Do you have a particular formal specification of a tortured mind in mind?

Comment author: shminux 14 June 2012 06:37:25PM * -1 points

Not sure what you are asking. My point was that the human notion of torture is a priori a tiny speck in the ocean of possible Turing machines. We don't know nearly enough at this point to worry about accidental or intentional sim torture, so we shouldn't, until at least a ballpark estimate with a confidence interval of a few sigma can be computed. This is a standard cognitive bias people fall into, here and elsewhere: a failure of imagination. In this particular case, someone brings up the horrifying possibility of 3^^^3 sims being tortured and killed, and the emotional response is strong enough to block any rational analysis of the odds of it happening. It also reminds me of people conjuring up aliens as human-looking, human-thinking and, of course, emotionally and sexually compatible. EY wrote at some length about how silly this is.

Comment author: [deleted] 14 June 2012 07:18:42PM 1 point

Not sure what you are asking.

Is it not clear that in order to calculate the probability of any proposition, you need an actual definition of the proposition at hand?

My point was that the human notion of torture is a priori a tiny speck in the ocean of possible Turing machines. We don't know nearly enough at this point to worry about accidental or intentional sim torture, so we shouldn't, until at least a ballpark estimate with a confidence interval of a few sigma can be computed.

I think we agree that the only currently feasible arguments for any given value of P(a mind in such and such a mindspace is being tortured) are those based on heuristics.

However, you say these minds constitute "a priori a tiny speck", and I do not endorse such a statement (given any reasonable definition of torture), unless you have some unstated, reasonable, heuristic reason for believing so. Ironically, "failure of imagination" is frequently a counterargument to people arguing that a certain reference class is a priori very small.

Comment author: shminux 14 June 2012 08:32:17PM -1 points

Is it not clear that in order to calculate the probability of any proposition, you need an actual definition of the proposition at hand?

My only reason is of the Pascal's-wager type: you pick one possibility (tortured sims) out of unimaginably many, without providing any estimate of its abundance in the sea of all possibilities. Why privilege it?

Comment author: MinibearRex 14 June 2012 10:11:49PM 0 points

I don't think most people talking about torture vs dust specks actually expect it to happen. And even if it actually could happen, it might be a smart idea to precommit to refusing to play any crazy games with an intelligence that wants to torture people. The point of the discussion is ethics. It's a thought experiment. It's not actually going to happen.

Comment author: shminux 14 June 2012 11:12:13PM 0 points

Not sure why you are bringing up specks vs torture; that must be some misunderstanding.

Comment author: MinibearRex 15 June 2012 04:50:24AM 0 points

It was the line about torturing and killing 3^^^3 sims. It seemed like you were referencing all of the various thought experiments people have discussed here involving that number. I only mentioned torture vs specks, but the point is the same. I don't think anyone ever actually expects something to happen in real life that involves the number 3^^^3.
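(For context on the number itself: 3^^^3 is Knuth's up-arrow notation, where each extra arrow iterates the previous operation, so 3^^^3 = 3^^(3^^3) = 3^^7625597484987, a power tower of 3s roughly 7.6 trillion levels tall. A minimal Python sketch of the recursion follows; the `up_arrow` helper is an illustrative name, not a standard library function, and only tiny inputs are actually computable.)

```python
def up_arrow(a, n, b):
    """Compute a ↑^n b (a followed by n up-arrows, then b).

    Defined recursively: one arrow is ordinary exponentiation,
    and a ↑^n b = a ↑^(n-1) (a ↑^n (b-1)), with a ↑^n 0 = 1.
    """
    if n == 1:
        return a ** b          # base case: a^b
    if b == 0:
        return 1               # empty iteration
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# Small cases are tractable; anything like 3^^^3 is not.
print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^27... no, 3^(3^3)) = 3^27 = 7625597484987
```

Even 3^^4 = 3^(3^^3) already has trillions of digits, which is why 3^^^3 only ever appears in thought experiments.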

Comment author: wedrifid 14 June 2012 06:55:42PM 1 point

My primary moral is to resist the temptation to generalize over all of mind design space

That's what I think every time someone brings up the idea of tortured sims. What are the odds of it happening?

The odds of that happening are almost entirely unrelated to the proportion of mind space that such torture machines make up. That kind of 'torture AI' is something that would be created - either chosen out of mind space deliberately, or produced by making a comparatively tiny error when aiming for a Friendly AI. It isn't the sort of thing that is just randomly selected.