My school has a weekly event on Thursdays where someone can give a 15-25 minute lecture on a topic of their choice during the lunch break. Typical attendance is about 20-30 students, aged between 14 and 18, and some teachers drop by if the topic relates to their subject. It's heavily interlinked with the philosophy department, in that topics are typically about religion or ethics, so the audience is generally more philosophically informed than average. A good percentage are theists or deists, and there's a very high chance that the subject will be discussed more thoroughly in the philosophy club the day after.
In a previous lecture a few months ago I tried to explain some standard biases, the Map/Territory concepts, Bayes, and generally attempted to compress the core Sequences into 25 minutes. Despite a lot of interest from the head of the philosophy department, it didn't go as well as I'd hoped for the rest of the audience. The problem was that I tried to close too many inferential gaps in too many areas in too short a timespan, so this time I thought I should take one rationality idea and go into detail. The trouble is, I don't know which one to choose for maximum impact. I've decided against cryonics because I don't feel confident that I know enough about it.
So what do you think I should talk about for maximum sanity-waterline-raising impact?
Illusion of transparency?
The illusion of transparency is assuming that the contents of MIND1 and MIND2 must be similar, ignoring that MIND2 lacks information that strongly influences how MIND1 thinks.
Expecting short inferential distances is underestimating the vertical complexity (information that requires knowledge of other information) of a MAP.
EDIT: I don't know if there is a standard name for this, and it wouldn't surprise me if there isn't. It seems to me that most biases are about how minds work and communicate, while "inferential distance" is about maps that did not exist in the ancestral environment.