Possible absolute shite ahead (I went the folksy route):
"So, I hear you care about rationality. What's that about?"
It's about being like Brad Pitt in Moneyball. (Oh, you didn't see it? Here's a brief spoiler-free synopsis.) It's the art of seeing how others, and even you yourself, are failing, and then doing better.
"Science doesn't know everything."
Oh, yeah, I completely agree. But, it does know a helluva lot. It put us on the moon, gave us amazing technology like this [pull out your cellphone], and there's every reason to think it's going to blow our minds in the future.
"But you can't expect people to act rationally. We are emotional creatures."
Yeah, no, that's true. We've recently seen all kinds of bad decisions--the housing crisis and so on. But that's all the more reason to try to get people to act more rationally.
"But sometimes you can't wait until you have all the information you need. Sometimes you need to act right away."
Yeah, true... true. Still, we can prepare in advance for those situations. For example, you might have reason to believe that you're going to start a new project at your job. That's going to involve a lot of decisions, and any poor decision at such an early stage can magnify as time goes by. That's why you prepare the best you can for those quick decisions that you know you'll be making.
"But we have to use intuition sometimes. And sometimes, my intuitions are pretty good!"
Yeah, intuitions are just decisions based on experience. I remember reading that chess masters, y'know like Bobby Fischer or Kasparov, don't even deliberate on their decisions, they just know; whereas chess experts, a level below master, do deliberate. But to get to that level of mastery, you need tens of thousands of hours of practice, man. Only a few of us are lucky enough to have that kind of experience in even a very narrow area. If you're something like an intermediate chess player in an area with a bunch of skilled chess players, your intuition is going to suck.
"But I'm not sure an AI can ever be conscious."
Maybe not, but that's not really important. Did you hear about Watson? That machine that beat those Jeopardy! players? They're saying Watson could act as a medical diagnostician like House and do a better job at it. Not only that, but it'd be easier than playing Jeopardy!... isn't that crazy?
Oh, yeah, I completely agree. But, it does know a helluva lot. It put us on the moon, gave us amazing technology like this [pull out your cellphone], and there's every reason to think it's going to blow our minds in the future.
I like the others, but I think the problem with this one is that it doesn't give them any reason not to fill the gaps in whatever science knows now with whatever the hell they want.
I'm trying to develop a large set of elevator pitches / elevator responses for the two major topics of LW: rationality and AI.
An elevator pitch lasts 20-60 seconds, and is not necessarily prompted by anything, or at most is prompted by something very vague like "So, I heard you talking about 'rationality'. What's that about?"
An elevator response is a 20-60 second, highly optimized response to a commonly heard sentence or idea, for example, "Science doesn't know everything."
Examples (but I hope you can improve upon them):
"So, I hear you care about rationality. What's that about?"
"Science doesn't know everything."
"But you can't expect people to act rationally. We are emotional creatures."
"But sometimes you can't wait until you have all the information you need. Sometimes you need to act right away."
"But we have to use intuition sometimes. And sometimes, my intuitions are pretty good!"
"But I'm not sure an AI can ever be conscious."
Please post your own elevator pitches and responses in the comments, and vote for your favorites!