I'm trying to develop a large set of elevator pitches / elevator responses for the two major topics of LW: rationality and AI.
An elevator pitch lasts 20-60 seconds and is not necessarily prompted by anything; at most it's prompted by something very vague like "So, I heard you talking about 'rationality'. What's that about?"
An elevator response is a 20-60 second, highly optimized response to a commonly heard sentence or idea, for example, "Science doesn't know everything."
Examples (but I hope you can improve upon them):
"So, I hear you care about rationality. What's that about?"
Well, we all have beliefs about the world, and we use those beliefs to make the decisions we think will get us the most of what we want. What most people don't realize is that there is a mathematically optimal way to update your beliefs in response to evidence, and a mathematically optimal way to figure out which decision is most likely to get you the most of what you want; these methods are defined by probability theory and decision theory. Moreover, cognitive science has discovered a long list of predictable mistakes our brains make when forming beliefs and making decisions, and there are particular things we can do to improve our beliefs and decisions. [This is the abstract version; it's probably better to open with a concrete, vivid example.]
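To make the "mathematically optimal way to update your beliefs" concrete, here is a minimal sketch of a single Bayesian update. The hypothesis and every number in it (a 1% prior, evidence that is ten times as likely if the hypothesis is true) are made up purely for illustration.

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
# All numbers below are assumed purely for illustration.
prior = 0.01            # P(H): belief in the hypothesis before seeing the evidence
p_e_given_h = 0.90      # P(E|H): probability of the evidence if H is true
p_e_given_not_h = 0.09  # P(E|~H): probability of the evidence if H is false

# Total probability of seeing the evidence at all
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

# Posterior: updated belief after seeing the evidence
posterior = p_e_given_h * prior / p_e

print(f"Belief before the evidence: {prior:.1%}")      # 1.0%
print(f"Belief after the evidence:  {posterior:.1%}")  # ~9.2%
```

The point isn't that anyone runs this calculation in their head; it's that there is a precise standard against which our everyday belief updates can be compared.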
"Science doesn't know everything."
As the comedian Dara O'Briain once said, science knows it doesn't know everything, or else it'd stop. But just because science doesn't know everything doesn't mean you can use whatever theory most appeals to you. Anybody can do that, with whatever crazy theory they like.
"But you can't expect people to act rationally. We are emotional creatures."
But of course. Expecting people to be rational is irrational. If you expect people to usually be rational, you're ignoring an enormous amount of evidence about how humans work.
"But sometimes you can't wait until you have all the information you need. Sometimes you need to act right away."
But of course. You have to weigh the cost of gathering new information against the expected value of that information. Sometimes it's best to just act on the best of what you know right now.
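To make that trade-off concrete, here is a small sketch of the value-of-information calculation this response gestures at. The two actions, the two possible states of the world, and every payoff and probability are invented purely for illustration.

```python
# Should we gather more information before deciding, or just act now?
# All payoffs and probabilities are assumed purely for illustration.
p_state1 = 0.6               # belief that we're in state 1 rather than state 2
payoff = {                   # payoff[action][state]
    "A": {1: 100.0, 2: 20.0},
    "B": {1: 80.0,  2: 80.0},
}

def expected_value(action: str) -> float:
    return p_state1 * payoff[action][1] + (1 - p_state1) * payoff[action][2]

# Acting now: take the action with the highest expected value given current beliefs
act_now = max(expected_value(a) for a in payoff)                     # 80.0 (action B)

# With perfect information we would learn the state first, then pick the best action in it
with_info = (p_state1 * max(payoff[a][1] for a in payoff)
             + (1 - p_state1) * max(payoff[a][2] for a in payoff))   # 92.0

value_of_information = with_info - act_now                           # 12.0
print(f"Gathering the information is worth paying up to {value_of_information:.1f}")
```

If the information would cost more than that, the rational move really is to act on what you already know.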
"But we have to use intuition sometimes. And sometimes, my intuitions are pretty good!"
But of course. We even have lots of data on which situations are conducive to intuitive judgment, and which ones are not. And sometimes, it's rational to use your intuition because it's the best you've got and you don't have time to write out a bunch of probability calculations.
"But I'm not sure an AI can ever be conscious."
That won't keep it from being "intelligent" in the sense of being very good at optimizing the world according to its preferences. A chess computer is great at optimizing the chess board according to its preferences, and it doesn't need to be conscious to do so.
Please post your own elevator pitches and responses in the comments, and vote for your favorites!
One of the problems is that you say things like "I've been rational for years." Sorry, but no, you haven't. EY hasn't been rational for years. You may have been an aspiring rationalist, but that's a far cry from actually being rational. Saying things like that is extremely off-putting because it sounds self-congratulatory; that's something this community struggles with a lot, and we typically downvote such things heavily because they send very bad signals about what this website is. Beyond that, when it's said by someone with the username "911truther", it implies an element of "You're not rational unless you're a truther too", which, mean it or not, is how it comes across.
Secondly, and relatedly: your username. It's inherently political, which brings up all of our opposition to politics every time you post. That's not a good thing, and it will make it very difficult for anyone on this site to take you seriously. If two people wrote articles of exactly equal caliber, and one author was named BobSmith and the other Obama2012, I would anticipate at least 2-3 times the upvoting on the former and 2-3 times the downvoting on the latter. And 9/11 is a far more polarizing issue. The vast, vast majority of people here disagree with you. But roland, despite being heavily downvoted every time he brings up 9/11, manages to maintain positive karma, because the topic isn't inherently raised every time he posts. I cannot recommend strongly enough that you delete your account and create a new username if you wish to continue on this site. If you are a 9/11 truther, I would not suggest lying about that, but choosing it as the phrase by which you identify yourself is not an effective strategy for being taken seriously here.
Thirdly, the great-grandparent of this isn't a terrible comment; I agree with you there. I likely would have upvoted it had it been made under a different username, since I didn't think it deserved that level of downvoting (not because I thought it was particularly wonderful in and of itself).