I sympathize with the suggestion, which you may or may not have intended, that such a world would look a lot like our world. But maybe we should make the question more concrete. What benefits do people honestly expect from LW rationality? Are they actually getting those benefits?
Hard to say. My life would look completely different. Honestly, I was for the most part much happier before getting involved, but I'm certainly more effective now, to the point where I no longer occupy the same reference class in any useful sense.
LW doesn't seem to have a discussion of the article Epiphany Addiction, by Chris at succeedsocially. First paragraph:
I like that article because it describes a dangerous failure mode of smart people. One example was the self-help blog of Phillip Eby (pjeby), where each new post seemed to bring amazing new insights, until after a while you became jaded. An even better, though controversial, example could be Eliezer's Sequences, if you view them as a series of epiphanies about AI research that didn't lead to much tangible progress. (Please don't make that statement the sole focus of discussion!)
The underlying problem seems to be that people get a rush of power from neat-sounding realizations and mistake that feeling for actual power. I don't know any good remedy for that, but being aware of the problem could help.