CalmCanary comments on Hedonium's semantic problem - LessWrong

Post author: Stuart_Armstrong 09 April 2015 11:50AM




Comment author: CalmCanary 09 April 2015 11:56:32PM 1 point

Very interesting post, but your conclusion seems too strong. Presumably, if instead of messing around with artificial experiencers we just fill the universe with humans being wireheaded, we should be able to get large quantities of real pleasure with fairly few actually worthwhile experiences; we might even be able to get away with just disembodied human brains. Given this, it seems highly implausible that if we try to transfer this process to a computer, we are forced to create agents so rich and sophisticated that their lives are actually worth living.

Comment author: Stuart_Armstrong 10 April 2015 10:39:37AM 1 point

we just fill the universe with humans being wireheaded, we should be able to get large quantities of real pleasure with fairly few actually worthwhile experiences

By this argument, we might not. If the wireheaded human beings never have experiences and never access their memories, in what way do they remain human beings? I.e., if we could lobotomise them without changing anything, are they not already lobotomised?

Comment author: Manfred 12 April 2015 05:03:33PM 1 point

if we could lobotomise them without changing anything, are they not already lobotomised?

Very unseriously: Of course not, because if they were already lobotomized we wouldn't be able to lobotomize them. :P