TheAncientGeek comments on Open thread, Jul. 25 - Jul. 31, 2016 - Less Wrong
Well, in the normal course of life, on the object level, some things are more probable than others.
If you push me about whether I REALLY know they're true, then I admit that my reasoning and data could be confounded by a Matrix or whatever.
Maybe it's clearer like so:
Colloquially, I know how to judge relative probabilities.
Philosophically (strictly), I don't know the probability that any of my conclusions are true (because they rest on concepts I don't pretend to know are true).
About the moral values thing, it sounds kinda like you haven't read the sequence on metaethics. If not, then I'm glad to be the one to introduce you to the idea, and I can give you the broad strokes in a few sentences in a comment, but you might want to ponder the sequence if you want more.
Morality is a set of things humans care about. Each person has their own set, but as humans with a common psychology, those sets greatly overlap, creating a general morality.
But humans don't have access to our source code. We can't see all that we care about. Figuring out the specific values, and how much to weigh them against each other, is just the old game of thought experiments, considering trade-offs, etcetera.
There's nothing that can be reduced to some one-word or one-sentence idea that sums it all up. So we don't know what all the values are or how they're weighted. You might read about "Coherent Extrapolated Volition," if you like.
Morality is not arbitrary any more than circularity is arbitrary. Both refer to a specific thing with specific qualities. If you change the qualities of the thing, that doesn't change morality or change circularity, it just means that the thing you have no longer has morality, no longer has circularity.
A great example is Alexander Wales' short story "The Last Christmas" (particularly chapters 2 and 3), excerpted below.
The elves care about Christmas Spirit, not right and wrong, or morality, or fairness.
When it's pointed out that what they're doing isn't fair, they don't protest, they just say "We don't care. Fairness isn't part of the Christmas Spirit."
And we might say, "Santa being fat? We don't care, that's not part of morality. We don't deny that it's part of the Christmas Spirit; we just don't care that it is."
If aliens care about different things, it's not about our morality versus "their" morality. It would be about THE morality versus THE Glumpshizzle. The paper-clipper is also used as an example: it doesn't care about morality. It cares about clippiness.
The moral thing and the clippy thing to do are both fixed calculations. Once you know the answer, it's a feature of your mind if you happen to respond to morality, or clippiness, or Glumpshizzle, or Christmas Spirit.
If anybody thinks I've misunderstood part of this, please, do let me know. I've tried to understand, and would like to correct any mistakes if I have them.
“You wouldn’t even make any arguments for why you should live?” asked Charles.
“My life is meaningless in the face of the Christmas spirit,” said Matilda.
“But if it didn’t matter to the Christmas spirit,” said Charles, “If I just wanted to see you die for fun?”
“Allowing you to satisfy your desires is part of maintaining the Christmas spirit, Santa,”
“It’s unfair,” said Charles.
“Life is unfair,” said Matilda.
“Does it have to be?” asked Charles. “Is that the Christmas spirit?”
“I don’t know,” said Matilda. “Fairness doesn’t enter into it, I don’t think. Why should Christmas be fair if life isn’t fair?”
http://alexanderwales.com/the-last-christmas-chapter-1-2/
Again, my point is that to do justice to philosophical doubt, you need to avoid high probabilities in practical reasoning, à la Taleb. But not everyone gets that. A lot of people think that using probability alone is sufficient.