Since high school I've been involved in conworlding: the collaborative development of fictional worlds and societies, and then setting stories or games in them.
Around 2005, some friends and I set a story in a culture with a goddess named Per married to a god named Elith. The religion came to be called "Perelithve".
Skip to 2008. Neal Stephenson publishes Anathem. One throwaway reference mentions two of the avout, a woman named Per who marries a man named Elith. The marriage rite they invent gets called "Perelithian".
If names can have between 3 and 8 letters, always alternate vowels and consonants, and 'th' counts as one sound, I calculate that the chance of someone who needs two names coming up with "Per" and "Elith" is on the order of one in a billion. The similarity in the stories maybe adds another two or three bits of unlikelihood. But if I've read 1000 novels, each of which has 100 minor characters, and my conworlds contain 1000 characters, then the odds aren't really that bad — perhaps as high as 1%.
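The Fermi estimate can be sketched numerically. Every input here is one of the rough guesses from the text (base odds, bit penalty, character counts), not a measured value:

```python
# Fermi estimate of the expected number of name-pair collisions.
# All inputs are the post's own rough guesses, not measured values.

base_odds = 1e-9          # chance a given name pair matches "Per"/"Elith"
story_bits = 3            # extra unlikelihood from the similar marriage plot
p_match = base_odds / 2**story_bits

novels = 1000             # novels read
minor_chars = 100         # minor characters per novel
conworld_chars = 1000     # characters across my conworlds

# Treat each (novel character, conworld character) pairing as one trial.
trials = novels * minor_chars * conworld_chars
expected_collisions = trials * p_match
print(f"{expected_collisions:.4f}")   # about 0.0125, i.e. roughly 1%
```

Without the two-to-three-bit story penalty the same arithmetic gives about 10%, which is why the penalty matters to the "not really that bad" conclusion.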
Still freaks me out, though.
I don't think you have to be a moral anti-realist to believe the orthogonality thesis, but you certainly have to be a moral realist to not believe it.
Now if you're a moral realist and you try to start writing an AI, you're going to see quickly that you have a problem.
Doesn't work. So you have to start defining "morality", and you figure out pretty quickly that no one has the least idea how to do that in a way that doesn't rapidly lead to disastrous consequences. You end up with the only plausible option looking like: "Examine what humans would want if they were rational and had all the information you have." It seems to me that that is the moment you should just become a moral subjectivist -- maybe of the ideal observer theory variety.
Now you might still believe the orthogonality thesis because you are a moral realist who doesn't believe in motivational internalism -- there are lots of ways to get there. But you can't be an anti-realist and ever even come close to making such a mistake.
No, because it's possible that there genuinely is a total moral ordering, but that nobody knows how to figure out what it is. "No human always knows what's right" is not an argument against moral realism, any more than "No human knows everything about God" is an argument against theism.
(I'm not a moral realist or a theist.)