If rationality requires truth, and truth requires a motivation, can rationality exist as a motivation on its own?
It can logically exist as a motivation on its own, but far more people believe they have that motivation than actually do. Even if you feel that you seek truth for its own sake, that feeling is probably mistaken.
I think I remember that Nietzsche did not believe it was possible.
Can rationality give a person happiness given that's their goal?
Rationality gives people different things depending on the person and their environment. The best way to predict what would happen in a hypothetical scenario is to be rational, and being able to predict things accurately probably causes more happiness than it prevents, for most people. Still, that's a mild side effect of rationality; practices designed specifically around happiness would have a better chance of actually affecting it (I suspect most of those basically fail, but there are a few gems among them).
My view, which others such as Eliezer do not share, is that rationality is much more about not losing than about winning. Rationality prevents people from making mistakes; this is only equivalent to winning, and to positively creating success, if one goes on a significant not-losing streak.
So I'd say that if you are happy naturally, and unhappy when bad things happen to you, rationality will probably help a lot. If you are naturally unhappy, and need good things to happen in order to be happy, it won't make you happy at all; it will only lessen the frequency and severity of failures and problems. It improves one's net happiness, but it doesn't make one happy.
You're talking with someone you like, and they ask you what you mean by rationality, or why you keep going to LessWrong meetups. Or you meet someone who might be interested in the site.
What do you say to them? If you had to explain to someone what LW-style rationality is in 30 seconds, how would you do it? What's your elevator pitch? Has anyone had any success with a particular pitch?
My Current Pitch:
My current best one, made up on the spot without any forethought, basically consists of:
"Basically, our brains are pretty bad at forming accurate beliefs, and bad in fairly systematic ways. I could show you one, if you want."
Then I play the triplet game with them, and reveal that the rule is just that the numbers need to be ascending.
Upon failure: "Basically, your brain just doesn't look for examples that disprove your hypothesis, so you didn't notice that it could have been a more general rule. There are a bunch of others, and I'm interested in learning about them so that I can correct for them."
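For concreteness, the triplet game above can be sketched in a few lines of Python. This is just an illustration under the stated assumption that the hidden rule is "strictly ascending"; the function name `fits_rule` is my own, not anything from the original game:

```python
def fits_rule(triplet):
    """Return True if the triplet satisfies the hidden rule:
    the numbers are strictly ascending."""
    a, b, c = triplet
    return a < b < c

# Guesses people typically try, all consistent with the narrower
# hypothesis "increase by 2 each time":
confirming_guesses = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]

# A disconfirming guess: it breaks "increase by 2" on purpose,
# to test whether the rule is actually broader than that.
disconfirming_guess = (1, 2, 100)

for t in confirming_guesses:
    print(t, fits_rule(t))            # each prints True
print(disconfirming_guess, fits_rule(disconfirming_guess))  # also True
```

The point of the sketch: every confirming guess returns True, so a player who only tests guesses matching their narrow hypothesis never gets evidence against it. Only a guess like `(1, 2, 100)` distinguishes "add 2" from the real, more general rule.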
My Thoughts on That:
It's massively effective at convincing people that cognitive biases exist (when they're in the 80% that fails, which has always been the case for me so far), but pretty much entirely useless as a rationality pitch. It doesn't explain at all why people should care about having accurate beliefs; it just takes it as given that they would.
It's also far too dry and unfun (compared to, say, Methods), and has the unfortunate side effect of making people feel like they've been tricked. It does make us look non-cultish, though.
I suspect that other people can do better, and I'll comment later with one that I actually put thought into. There's a pretty good chance that I'll use a few of the more upvoted ones and see how they go over.