I interpreted "retrain myself to perform only those steps over the course of 30 seconds" to mean that after training for n seconds/minutes/hours, he could solve an equivalent problem in 30 seconds (via the distilled steps). You seem to interpret it to mean that the training takes 30 seconds, and the length of time to solve the problem after training is unspecified.
I don't know which it is; the wording seems ambiguous.
To chime in with a stronger example the cynical audience member from 1.4 could've used: religion. Religions are constantly morphing and evolving, and their practices have ranged from live sacrifice (human and non-human) to sweeping the ground while walking to avoid potentially hurting a bug. That sorta falls under moral philosophy, but I think all the other, non-moral aspects of religion make the point more strongly. There's no way to determine that religion X is true and religion Y is false, and no grounding in reality. And given that the strongest predictor of an individual's religion is the one they were born into, their discernment doesn't seem to be pulling much weight.
Now, you could say religion is a useful idea in terms of social cohesion, but I think that if AI convinces us of happy lies, that's not a great outcome (though better than many possible ones).