Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
4. Open Threads should start on Monday and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "
I had never encountered things like Newcomb's problem before LW. And after years on this site, I still don't understand their relevance, or why the more AI-x-risk-focused people here obsess over them. Such problems have very little practical value and are extremely far removed from applied rationality.
I agree with Lumifer. It's hard to look at LW and not come away with a bad aftertaste of ivory tower philosophizing in the pejorative sense.
Doesn't that bother you?
If the goal of applied rationalists is to improve upon applied rationality and teach it to others, wouldn't it behoove us to reframe the way we speak here, and to think about how our words might be interpreted?
It doesn't matter how good an idea somebody has: if they can't communicate it palatably, it won't reliably pass on, not to other people, not to the next generation, not to anybody.