
Lumifer comments on Summoning the Least Powerful Genie - Less Wrong Discussion

-1 Post author: Houshalter 16 September 2015 05:10AM




Comment author: Lumifer 18 September 2015 08:38:25PM * 0 points

Notice the difference (emphasis mine):

A program designed to answer a question necessarily wants to answer that question

vs

...it becomes more predictive to think of it as wanting things

Comment author: VoiceOfRa 20 September 2015 08:42:13PM 2 points

Well, the fundamental problem is that LW-style qualia-free rationalism has no way to define what the word "want" means.

Comment author: lmm 20 September 2015 06:40:43PM -1 points

Is there a difference between "x is y" and "assuming that x is y generates more accurate predictions than the alternatives"? What else would "is" mean?

Comment author: Lumifer 21 September 2015 03:08:49PM 1 point

Is there a difference between "x is y" and "assuming that x is y generates more accurate predictions than the alternatives"? What else would "is" mean?

<boggle> Are you saying the model with the currently-best predictive ability is reality??

Comment author: lmm 25 September 2015 06:51:48AM -1 points

Not quite - rather, the everyday usage of "real" refers to the model with the currently-best predictive ability. http://lesswrong.com/lw/on/reductionism/ - we would all say "the aeroplane wings are real".

Comment author: Lumifer 25 September 2015 02:40:37PM * 1 point

rather the everyday usage of "real" refers to the model with the currently-best predictive ability

Errr... no? I don't think this is true. I'm guessing that you want to point out that we don't have direct access to the territory and that maps are all we have. But that's not very relevant to the original issue of replacing "I find it convenient to think of that code as wanting something" with "this code wants" and insisting that the code's desires are real.

Anthropomorphization is not the way to reality.