Wei_Dai comments on To what degree do we have goals? - Less Wrong

45 Post author: Yvain 15 July 2011 11:11PM


Comment author: Wei_Dai 16 July 2011 01:34:12AM 5 points [-]

The post cites being upset or angry as evidence of certain apparent preferences being closer to genuine preferences, but a paperclip maximizer wouldn't get upset or angry if a supernova destroyed some of its factories, for example. I think being upset or angry when one's consciously held goals have been frustrated is probably just a signaling mechanism, and not evidence of anything beyond the fact that those goals are consciously held (or "approved" or "endorsed").

Comment author: Armok_GoB 16 July 2011 12:22:17PM 8 points [-]

If a staple maximizer came in with a ship and stole some of the paperclip factories for remaking into staple factories, the paperclipper would probably expend resources to take revenge for game theoretical reasons, even if this cost paperclips.

Comment author: Nebu 11 December 2015 06:43:37AM 0 points [-]

I think this argument is misleading.

Re "for game theoretical reasons", the paperclipper might take revenge if it predicted that doing so would serve as a signalling disincentive, deterring other office-supply-maximizers from stealing paperclips. In other words, the paperclip-maximizer is spending paperclips to take revenge solely because, in its calculation, this actually leads to the expected total number of paperclips going up.
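[Editor's note: the expected-value reasoning above can be illustrated with a toy calculation. All numbers and names below are made-up assumptions, not anything from the thread.]

```python
# Toy model: does committing to costly revenge raise expected paperclips?
# A revenge policy burns paperclips when triggered, but deters theft.
# Every constant here is an illustrative assumption.

REVENGE_COST = 100        # paperclips burned carrying out revenge
THEFT_LOSS = 1_000        # paperclips lost per successful theft
N_THIEVES = 5             # potential office-supply-maximizers nearby
P_THEFT_NO_REVENGE = 0.8  # theft probability with no deterrent
P_THEFT_REVENGE = 0.1     # theft probability under a known revenge policy

def expected_loss(p_theft: float, revenge: bool) -> float:
    """Expected paperclips lost, summed over all potential thieves."""
    per_thief = p_theft * (THEFT_LOSS + (REVENGE_COST if revenge else 0))
    return N_THIEVES * per_thief

no_policy = expected_loss(P_THEFT_NO_REVENGE, revenge=False)
with_policy = expected_loss(P_THEFT_REVENGE, revenge=True)

print(no_policy)    # 4000.0
print(with_policy)  # 550.0
# The revenge policy costs extra paperclips whenever it fires, yet its
# deterrence effect makes the expected total paperclip loss much smaller.
```

Under these (made-up) numbers the deterrence effect dominates the cost of carrying out revenge, which is the calculation Nebu is pointing at.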

Comment author: Armok_GoB 15 December 2015 10:19:20PM 1 point [-]

That assumes the scenario is iterated; I'm saying it would precommit to do so even in a one-off scenario. The rest of your argument was my point: that the same reasoning goes for anger.

Comment author: wedrifid 16 July 2011 04:07:58PM *  3 points [-]

but a paperclip maximizer wouldn't get upset or angry if a supernova destroyed some of its factories, for example.

I probably wouldn't either. It sounds like the sort of amortized risk that I would have accounted for when I spread the factories out through thousands of star systems. The anger would come in only when the destruction was caused by another optimising entity. And more specifically by another entity that I have modelled as 'agenty' and not one that I have intuitively objectified.

Comment author: wedrifid 16 July 2011 04:11:52PM *  4 points [-]

I think being upset or angry when one's consciously held goals have been frustrated is probably just a signaling mechanism,

The signalling element is critical but I can't agree that they are just signalling. Those emotions also serve to provoke practical changes in behaviour.

Comment author: [deleted] 16 July 2011 03:48:04AM 2 points [-]

Who is meant to receive the signal sent by anger from a goal thwarted? My impression is that people try to keep a lid on such frustration, e.g. because it might make them appear childish.

Comment author: Will_Sawin 16 July 2011 02:19:42PM *  1 point [-]

Was this different in EEA?

Comment author: arundelo 17 July 2011 03:38:49AM 1 point [-]

Do you mean EEA?

Comment author: Will_Sawin 17 July 2011 05:38:25AM 0 points [-]

Yes.

I believe I mashed up that acronym and the phrase "ancestral environment" to end up with "AEE", but I'm not sure.

Comment author: [deleted] 16 July 2011 04:03:41PM 1 point [-]

I don't understand.

Comment author: Will_Sawin 17 July 2011 03:15:10AM 3 points [-]

Sorry.

One would expect behavior that is maladaptive now (e.g. emotional responses we need to keep a lid on) to have been better suited to the environment we evolved in. Overeating, for instance, shows this pattern.

So I'm suggesting that anger is a signaling mechanism that is sometimes faulty now, and sends signals we don't want to send. However, it evolved to send signals that were good in that environment.

This is not necessarily the case. Evolution could not perfectly control the signals we send - there are situations where we send one signal even though, even in the evolutionary environment, a different one would have been more advantageous.

Comment author: timtyler 16 July 2011 11:46:46AM *  0 points [-]

I think being upset or angry when one's consciously held goals have been frustrated is probably just a signaling mechanism [...]

Angry, probably. Upset, probably not.