If you want impact, use the narrative fallacy. What I mean is: use all of the other biases and fallacies you listed. Tell a story about John, the guy who met a cool scientist when he was in primary school and whose life goal is now to be a scientist. He decides to work on global warming because 'what could be more important than this issue?' He expects to live in the city and head a big lab... But he's not very good at global warming science (maybe he's not very good at research?), and he doesn't notice that the advice his colleagues give him isn't helping. So he sticks to his guns, because he's already got a degree in global warming, but he's always stressing about not having a job...
And so on.
And then rewind. John discovers rationality as a young adult and becomes John-prime. Compare John to John-prime, whose rationality training lets him recognise the availability bias at work on his dream of being a scientist. Since scholarship is a virtue, he researches, interviews... and discovers that politics is a much better fit! His rationality tells him that the most important thing is improving quality of life, not global warming or power, so he donates to third-world charities and, when he runs for political office, does so on a platform of improving social welfare and medical access. His rationality lets him evaluate advice-givers: he sees through most of the self-serving advice, and when he finds a mentor who seems genuine, he sticks with that mentor, improving his success in politics...
And so on.
(And then the punchline: explain why this story makes the audience feel like rationality is important, with a description of the narrative fallacy itself!)
A currently living person doesn't want to die, but a potentially living person doesn't yet want to live, so there is an asymmetry between the two scenarios.
I agree, and that's why my intuition pushes me towards Life Extension. But how does that fact fit into utilitarianism? And if you're diverging from utilitarianism, what are you replacing it with?