I assume everyone has run across at least one video in the "Shit Xs Say" format, such as Shit Skeptics Say? When done right, it totally triggers the in-group warm-fuzzies. (Not to be confused with the nearly identically formatted "Shit Xs Say to Ys", which is mainly a way for Ys to complain about Xs.)
What sort of things do Rationalists often say that trigger this kind of in-group recognition and could be popped into a short video? A few I can think of...
You should sign up for cryonics. I want to see you in the future.
…intelligence explosion…
What’s your confidence interval?
You know what they say: one man’s Modus Ponens is another man’s Modus Tollens.
This may sound a bit crazy right now, but hear me out…
What are your priors?
When the singularity comes that won’t be a problem anymore.
I like to think I’d do that, but I don’t fully trust myself. I am running on corrupted hardware after all.
I want to be with you, and I don’t foresee that changing in the near future.
…Bayesian statistics…
So Omega appears in front of you…
What would you say the probability of that event is, if your beliefs are true?
Others?
This is too much fuuuuuuuun
"She's just signaling virtue."
"Money is the unit of caring."
"One-box!"
"Beliefs should constrain anticipations."
"Existential risk..."
"I'll cooperate if and only if the other person will cooperate if and only if I cooperate."
"I'm going to update on that."
"Tsuyoku naritai!"
"My utility function includes a term for the fulfillment of your utility function."
"Yeah, it's objective, but it's subjectively objective."
"I am a thousand shards of desire."
"Whoa, there's an inferential gap here that one of us is failing to bridge."
"My coherent extrapolated volition says..."
"Humans aren't agents." ("I'm trying to be more agenty." "Humans don't really have goals.")
"Wait, wait, this is turning into an argument about definitions."
"Look, just rejecting religion and astrology doesn't make someone rational."
"No, no, you shouldn't implement Really Extreme Altruism. Unless the alternative is doing it without, anyway..."
"I'll be the Gatekeeper, you be the AI."
"That's Near, this is Far."
"Don't fall into bottom-line thinking like that."
"My utility function includes a term for the fulfillment of your utility function."
Awww.... :)