If lukeprog does not use the word rationality in a post, is it rational? I suppose that depends on what you mean when you say rationality. By rationality, do you mean a lukeprog post that would, if read aloud, produce acoustic vibrations in the air that an intelligent agent would concede as rational, or do you mean the brain state invoked in an agent while it is deliberately maximizing expected utility?
I really like Spencer Greenberg's material. A+ stuff, for sure. No surprise, though, since he links to Less Wrong regularly and openly acknowledges Eliezer Yudkowsky's influence.
A clear exposition of the same material is an excellent thing - especially if it's less distressing to newcomers.
Thank you for pointing that out; it would have been better if I had spoken more carefully. I definitely don't think that uncertainty is in the territory. Please interpret "there is great uncertainty in X" as "our models of X produce very uncertain predictions."
Video link.
This kind of material is regularly featured on Spencer's blog, too.