Lumifer comments on Rationality Quotes August 2013 - Less Wrong

7 Post author: Vaniver 02 August 2013 08:59PM

Comment author: Lumifer 06 August 2013 06:26:51PM 1 point [-]

It's not about weight, it's about an absolute, discontinuous, hard limit -- regardless of how many utilons you can pile up on the other end of the scale.

Comment author: Bayeslisk 06 August 2013 08:22:28PM 2 points [-]

Well, no. It's weighed against the promise of however many utilons you can pile up on the other arm of the scale, and that promise may well not pay off at all. I'm reminded of a post here at some point whose gist was "if your model tells you that your chances of being wrong are 3^^^3:1 against, it is more likely that your model is wrong than that you are right."
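
This point about models can be put in hedged arithmetic (the numbers below are invented for illustration, not from the thread): once you allow even a small probability that the model itself is broken, that term swamps any astronomically confident prediction the model makes.

```python
# Invented numbers for illustration: a model assigns an event probability
# of 1e-30, but we give the model itself a 1% chance of being wrong, in
# which case (say) the event is as likely as a coin flip.
p_claimed = 1e-30      # probability the model assigns to the event
p_model_wrong = 0.01   # prior probability that the model is broken
p_if_wrong = 0.5       # probability of the event if the model is broken

# Total probability: average over the "model right" and "model wrong" cases.
p_event = (1 - p_model_wrong) * p_claimed + p_model_wrong * p_if_wrong
print(p_event)  # ~0.005, dominated entirely by the model-error term
```

The model's 3^^^3-scale confidence never survives into the final estimate: the chance of model error sets a floor on how improbable the event can be judged to be.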

Comment author: AndHisHorse 06 August 2013 08:34:21PM 1 point [-]

Yes, but the quote in no way concerns itself with the probability that such a plan will go wrong; rather, it explicitly covers even plans with a wide margin of error: "every" plan which ends in murder and children crying.

Comment author: Decius 07 August 2013 05:55:42PM 2 points [-]

If your plan ends in murder and children crying, what happens if your plan goes wrong?

Comment author: linkhyrule5 10 August 2013 01:53:42AM 1 point [-]

If your plan requires you to get into a car with your family, what happens if you crash?

Comment author: SaidAchmiz 10 August 2013 02:24:07AM 1 point [-]

Well, getting into a car with your family is not inherently bad, so it's not a very good parallel... but if your overall point is that "expected value calculations do not retroactively lose mathematical validity because the world turned out a certain way", then that's definitely true.

I think that the "what if it all goes wrong" sort of comment is meant to trigger the response of "oh god... it was all for nothing! Nothing!!!". Which is silly, of course. We murdered all those people and made those children cry for the expected value of the plan. Complaining that the expected value of an action is not equal to the actual value of the outcome is a pretty elementary mistake.
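
The distinction drawn here can be sketched numerically (a toy gamble with made-up utilities, not anything from the thread): the expected value is fixed by the probabilities and payoffs known at decision time, and an unlucky outcome does not retroactively change it.

```python
def expected_value(outcomes):
    """Expected utility of a plan, given (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Hypothetical plan: 90% chance of +100 utilons, 10% chance of -500.
plan = [(0.9, 100.0), (0.1, -500.0)]
print(expected_value(plan))  # 40.0

# Even if the 10% branch actually occurs and we realize -500 utilons,
# the decision was still correct ex ante: its expected value was +40.
```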

Comment author: Decius 10 August 2013 02:19:05AM 0 points [-]

The features of my plan designed to handle things going wrong kick in, and the damage is mitigated. I don't go on vacation, despite the nonrefundable expenses incurred. The plan didn't end in death and sadness, even if a particular implementation of it did.

When the plan ends in murder and children crying, every failure of the plan results in a worse outcome.

Comment author: wedrifid 10 August 2013 02:27:54AM *  0 points [-]

When the plan ends in murder and children crying, every failure of the plan results in a worse outcome.

This does not seem to follow. Failure of the plan could easily involve failure to cause the murder or crying to happen for a start. Then there is the consideration that an unspecified failure has completely undefined behaviour. Anything could happen, from extinction or species-wide endless torture to the outright creation of a utopia.

Comment author: glomerulus 10 August 2013 02:41:16AM *  2 points [-]

For most people, murder and children crying are a bad outcome for a plan; but if they're what the planner has selected as the intended outcome, the other probable outcomes are presumably worse. Theoretically, the plan could "fail" into an outcome with more utilons than murder and children crying, but such failures are obviously improbable: if they weren't, the planner would presumably have selected one of them as the desired outcome instead.
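
This argument can be sketched with invented numbers: among the outcomes a planner treats as realistically achievable, the intended one is by construction the one with the most utilons; any better outcome must sit in the improbable tail.

```python
# (name, probability, utility) triples -- all numbers invented for illustration.
outcomes = [
    ("intended: murder and crying", 0.80, -10.0),
    ("plan fizzles entirely",       0.19, -50.0),
    ("accidental utopia",           0.01, 1000.0),
]

ACHIEVABLE = 0.05  # hypothetical cutoff for "realistically achievable"
achievable = [o for o in outcomes if o[1] >= ACHIEVABLE]

# The planner aims at the achievable outcome with the highest utility.
intended = max(achievable, key=lambda o: o[2])
print(intended[0])  # "intended: murder and crying"
```

The "accidental utopia" branch carries more utilons than the intended outcome, but only because its probability fell below the cutoff; had it been achievable, the planner would have aimed at it instead.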

Comment author: Decius 10 August 2013 03:13:46AM 0 points [-]

Or at least have the foresight to see that they have become likely and alter the plan such that it now results in utopia instead of murder.

Comment author: Decius 10 August 2013 03:12:43AM 0 points [-]

I think we need to examine what we mean by 'fail'.

A plan does not fail simply because the actual outcome is different from the outcome judged most likely; a plan fails when a contingency not prepared for occurs which prevents the intended outcome from being realized, or when an explicit failure state of the plan is reached.

If I plan to go on a vacation and prepare for a major illness by deciding that I will cancel the vacation, then experiencing a major illness might cause the plan to fail, because I have identified that as a failure state. The more important the object of the plan, the harder I will work in the planning stage to minimize the likelihood of ending up in a failure state. (When sending a probe to Mars, for example, I want to prepare so that everything I can think of that might go wrong along the way still yields a success condition.)

Comment author: Document 10 August 2013 02:55:05AM 1 point [-]

The murder and children crying fail to occur in the intended quantity?

Comment author: Bayeslisk 06 August 2013 08:36:22PM 1 point [-]

It's not a matter of "the plan might go wrong", it's a matter of "the plan might be wrong", and the universal part comes from "no, really, yours too, because you aren't remotely special."

Comment author: linkhyrule5 10 August 2013 01:54:35AM 0 points [-]

Comment author: Bayeslisk 10 August 2013 05:47:42AM 1 point [-]

Sounds about right to me.