scaphandre comments on Gains from trade: Slug versus Galaxy - how much would I give up to control you? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (67)
I recognise that argument, but surely we can still use utility functions in our models in order to make progress in thinking about these things.
Even if we crudely imagine a typical human who happens to be ticking all of Maslow's boxes – with access to happiness, meaning and resources, tending towards our (current...) normalised '1' – and someone in solitary confinement, under psychological torture, tending towards our normalised '0' on the utility scale – even then, isn't the concept sufficiently coherent and grokkable to allow use of these kinds of models?
Do you disagree? I am curious – I have encountered this point several times and I'd like to see where we differ.