lukeprog comments on Do Humans Want Things? - Less Wrong

Post author: lukeprog 04 August 2011 05:00AM


Comment author: lukeprog 04 August 2011 11:06:53AM 2 points

I agree that the framing effect is more important than the reference-dependence of sense-data encoding. However, the loss of sense-data is not always just "adjusting for irrelevant background", and is not always throwing away something we would later have decided is "irrelevant to our goals."

Comment author: Kaj_Sotala 05 August 2011 07:47:00AM 6 points

When I first read the post, I thought you were going to say something along the lines of:

"Evolution has optimized us to strip away the irrelevant features when it comes to vision, since it's been vital for our survival. But evolution hasn't done that for things like abstract value, since there's been no selection pressure for that. It's bad that our judgments in cases like the K&T examples don't work more like vision, but that's how it goes".

Indeed, saying "let's make the problem worse" and then bringing up vision feels a bit weird. After all, vision seems like a case where our brain does things exactly right - it ignores the "framing effects" caused by changed lighting conditions and leaves invariant the things that actually matter.

Comment author: lukeprog 10 August 2011 11:25:25PM 1 point

I wrote a response here.

Comment author: shminux 04 August 2011 03:47:08PM 3 points

An illuminating (no pun intended) example of when adjustment to the ambient level of sense-data affects what people think they want would be nice. Without it, the whole section seems to detract from your point.

Comment author: lukeprog 10 August 2011 11:25:36PM 0 points

I wrote a response here.

Comment author: lukeprog 08 August 2011 01:29:31AM 0 points

But I'm not raising a puzzle about how people can think they want things even though they are behavioristic machines. I'm raising a puzzle about how we can be said to actually want things even though we are behavioristic machines that, for example, exhibit framing effects and can't use neurons to encode value for the objective intensities of stimuli.