Here's the new thread for posting quotes, with the usual rules:
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself.
- Do not quote comments/posts on LW/OB.
- No more than 5 quotes per person per monthly thread, please.
When I am speaking to people about rationality or AI, and they ask something incomprehensibly bizarre and incoherent, I am often tempted to give the reply that Charles Babbage gave to those who asked him whether a machine that was given bad data would produce the right answers anyway: "I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."
But instead I say, "Yes, that's an important question..." and then I steel-man their question, or I replace it with a question on an entirely different subject that happens to share some of the words from their original question, and I answer that question instead.
What does this mean?