lukeprog comments on Do Humans Want Things? - Less Wrong

23 Post author: lukeprog 04 August 2011 05:00AM


Comment author: Sniffnoy 04 August 2011 05:42:57AM 10 points

But as far as we can tell, our behavior is often not determined by our wanting a particular state of affairs, but by how our options are framed.

Moreover, neurons in the parietal and orbitofrontal cortices encode value in a reference-dependent way — that is, they do not encode value for objective states of affairs.

I'm not certain your examples of reference-dependent encoding of sense-data really demonstrate or have much to do with a lack of objective goals. (Of course, the framing effect example demonstrates this plenty well. :P ) As you point out, this is largely just adjusting for irrelevant background, like whether the sun is out, when what we care about has nothing to do with that. This is just throwing away the information at an early stage, rather than later after having explicitly determined that it's irrelevant to our goals.

Comment author: lukeprog 04 August 2011 11:06:53AM 2 points

I agree that the framing effect is more important than the reference-dependence of sense-data encoding. However, the loss of sense-data is not always just "adjusting for irrelevant background", and is not always throwing away something we would later have decided is "irrelevant to our goals."

Comment author: Kaj_Sotala 05 August 2011 07:47:00AM 6 points

When I first read the post, I thought you were going to say something along the lines of:

"Evolution has optimized us to strip away the irrelevant features when it comes to vision, since it's been vital for our survival. But evolution hasn't done that for things like abstract value, since there's been no selection pressure for that. It's bad that our judgments in cases like the K&T examples don't work more like vision, but that's how it goes".

Indeed, saying "let's make the problem worse" and then bringing up vision feels a bit weird. After all, vision seems like a case where our brain does things exactly right - it ignores the "framing effects" caused by changed lighting conditions and leaves invariant the things that actually matter.

Comment author: lukeprog 10 August 2011 11:25:25PM 1 point

I wrote a response here.

Comment author: shminux 04 August 2011 03:47:08PM 3 points

An illuminating (no pun intended) example of a case where the adjustment to the ambient level of sense-data affects what people think they want would be nice. Without it, the whole section seems to detract from your point.

Comment author: lukeprog 10 August 2011 11:25:36PM 0 points

I wrote a response here.

Comment author: lukeprog 08 August 2011 01:29:31AM 0 points

But I'm not raising a puzzle about how people think they want things even when they are behavioristic machines. I'm raising a puzzle about how we can be said to actually want things even when they are behavioristic machines that, for example, exhibit framing effects and can't use neurons to encode value for the objective intensities of stimuli.