Academian comments on Backchaining causes wishful thinking - Less Wrong

Post author: PhilGoetz | 19 May 2010 07:01PM | 15 points

Comments (18)

Comment author: Academian 19 May 2010 10:07:04PM 2 points

The final goal of a plan is a belief, i.e. the belief that state X currently holds. In your representation, this might appear as "X", but semantically it's always "believe(state(X))".

If that means what I think it does, I disagree. If you employ enough sense of intentionality to call something a "goal", then a self-referencing intelligence can refer to the difference between X obtaining and it believing X obtains, and choose not to wirehead itself into a useless stupor. This is what JGWeissman was getting at in Maximise Expected Utility, not Expected Perception of Utility.
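JGWeissman's distinction can be made concrete with a toy sketch (the agent, actions, and payoffs here are invented for illustration, not taken from either post): an agent that scores actions by the utility of the resulting world state rejects wireheading, while an agent that scores actions by its own predicted perception of utility chooses it.

```python
# Toy illustration (hypothetical numbers): two scoring rules for one agent.
# "wirehead" makes the agent *believe* utility is high without changing the world.
actions = {
    "gather_food": {"world_utility": 10, "perceived_utility": 10},
    "wirehead":    {"world_utility": 0,  "perceived_utility": 100},
}

def best_by(criterion):
    """Pick the action that maximises the given scoring rule."""
    return max(actions, key=lambda a: actions[a][criterion])

print(best_by("world_utility"))      # maximise expected utility -> gather_food
print(best_by("perceived_utility"))  # maximise expected *perception* of utility -> wirehead
```

The agent's code is identical in both cases; only the objective differs, which is why a self-referencing agent that can distinguish "X obtains" from "I believe X obtains" can deliberately optimize the former.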

Comment author: PhilGoetz 19 May 2010 11:26:37PM 2 points

I stated it poorly. Guess I better rewrite it. In the meantime, see my reply to Yvain below.

... time passes ...

I didn't rewrite it. I deleted it. That whole paragraph about believe(state(X)) contributed nothing to the argument. And, as you noted, it was wrong.

Comment author: orthonormal 20 May 2010 10:00:01PM 0 points

With that paragraph deleted, it was difficult for me (just reading it now) to make the inference connecting your argument to wishful thinking. You might want to spell it out.

Comment author: PhilGoetz 22 May 2010 12:02:01AM 0 points

I don't think it's because I deleted that paragraph. I think it was just unclear. I rewrote the second half.

Comment author: orthonormal 23 May 2010 05:18:00PM 0 points

Much improved, and accordingly upvoted.

Comment author: apophenia 20 May 2010 09:57:00PM 0 points

I read this article after you deleted that paragraph, but I had basically the same objection reading "between the lines" of the rest of what you said.

Obviously, any animal that did something like this all the time would die. Doing it to a limited degree, though, might really happen. Is there a way to test your hypothesis?

Comment author: PhilGoetz 22 May 2010 12:02:32AM 0 points

What does the "something like this" in your sentence refer to?

Comment author: apophenia 24 May 2010 04:04:33AM 0 points

Replacing a goal that must actually obtain, i.e. getting food, with the belief that the actions it is already taking (sitting in place) will obtain it food.
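The failure mode described here can be sketched with a toy backchainer (hypothetical code; the rule and action names are invented): if the top-level goal is the *belief* "have food", and one rule lets the planner adopt that belief with no precondition, a backward search that ignores whether beliefs track the world returns the wishful "plan" instead of the one that actually produces food.

```python
# Hypothetical backchainer: goals are beliefs, and each rule maps a goal belief
# to (action, precondition-belief). The "wishful" rule has no precondition, so
# the search satisfies the goal without doing anything in the world.
rules = {
    "believe(have_food)": [
        ("assume_current_actions_suffice", None),   # wishful: plant the belief
        ("forage", "believe(at_food_source)"),      # honest: requires a subgoal
    ],
    "believe(at_food_source)": [("walk_to_food", None)],
}

def backchain(goal, plan=()):
    """Return the first action sequence found that achieves `goal`."""
    for action, subgoal in rules.get(goal, []):
        if subgoal is None:
            return plan + (action,)
        sub = backchain(subgoal, plan)
        if sub is not None:
            return sub + (action,)
    return None

print(backchain("believe(have_food)"))  # -> ('assume_current_actions_suffice',)
```

The honest plan `('walk_to_food', 'forage')` exists in the rule base, but the search never reaches it, which is the "sitting in place" outcome: the goal belief is satisfied while the world state it was supposed to track is not.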