You can have some fun with people whose anticipations get out of sync with what they believe they believe.
I was once at a dinner party, trying to explain to a man what I did for a living, when he said: "I don't believe Artificial Intelligence is possible because only God can make a soul."
At this point I must have been divinely inspired, because I instantly responded: "You mean if I can make an Artificial Intelligence, it proves your religion is false?"
He said, "What?"
I said, "Well, if your religion predicts that I can't possibly make an Artificial Intelligence, then, if I make an Artificial Intelligence, it means your religion is false. Either your religion allows that it might be possible for me to build an AI; or, if I build an AI, that disproves your religion."
There was a pause, as the one realized he had just made his hypothesis vulnerable to falsification, and then he said, "Well, I didn't mean that you couldn't make an intelligence, just that it couldn't be emotional in the same way we are."
I said, "So if I make an Artificial Intelligence that, without being deliberately preprogrammed with any sort of script, starts talking about an emotional life that sounds like ours, that means your religion is wrong."
He said, "Well, um, I guess we may have to agree to disagree on this."
I said: "No, we can't, actually. There's a theorem of rationality called Aumann's Agreement Theorem which shows that no two rationalists can agree to disagree. If two people disagree with each other, at least one of them must be doing something wrong."
We went back and forth on this briefly. Finally, he said, "Well, I guess I was really trying to say that I don't think you can make something eternal."
I said, "Well, I don't think so either! I'm glad we were able to reach agreement on this, as Aumann's Agreement Theorem requires." I stretched out my hand, and he shook it, and then he wandered away.
A woman who had stood nearby, listening to the conversation, said to me gravely, "That was beautiful."
"Thank you very much," I said.
I am quite impressed by your ability to signal your prodigious intelligence. Less pompously: moments like that make for fond memories.
How should we interpret the comment above? Is it suggesting that EY's behavior was pompous? (As of this writing, the commenter has made only this one comment and does not appear to be active on LessWrong.) My take: >60% likely. Going "one level up", I would expect a majority of readers to at least wonder.
EY views other people's irrationality as his problem, and it seems to me this discussion demonstrates a sincere effort to engage with someone he perceived as irrational. The conversation was respectful; as it progressed, each person clarified what he meant, and they ended with a handshake. Had I been there at the outset of the conversation, I would not have expected such a good outcome. (Updated on 2024-May-1)