They aren't that well-aligned either: they fail in numerous basic ways which are not due to unintelligence. My usual example: non-rhyming poems. Every week for the past year or so I have tested ChatGPT with the simple, straightforward, unambiguous prompt: "write a non-rhyming poem". Rhyming is not a hard concept, and non-rhyming is even easier, and there are probably at least hundreds of thousands, if not millions, of non-rhyming poems in its training data; ChatGPT knows, however imperfectly, what rhyming and non-rhyming are, as you can verify by asking ...
I think this article wonderfully illustrates the primary relationship between emotions and epistemic rationality: namely, that emotions can be downstream of false beliefs. Robin Hanson added in another comment that this relationship can also run in the other direction, when strong emotions bias us in ways that make us less epistemically rational.
But I think there is also a separate relationship between emotions and instrumental rationality: namely, that emotions can influence which decisions you make. That influence includes, but is not limited to, epistemic bias.
Looks like this is actually a link to their second talk, on May 4th, 2009. Not sure if the first talk still exists.