The most important benefit I got from Less Wrong is that before LW I had a very fixed mindset about what I knew and didn't know, as if those were properties of the things themselves, and when I wanted to improve at something I just did it in a vague, directionless way.
A more concrete example: I've always liked modding video games, but what you can do in modding is very limited compared to coding, so at least once a year I made a half-hearted attempt to get better at modding, which resulted in nothing because the next step was always learning to code (which was in the "I can't" bin). After reading posts here by people doing awesome stuff, and internalizing that the map is not the territory, I realized that I could probably learn to code, and then the "I can't" bin broke. Now, exactly two years later, I'm fairly good with Python and Java, and I've picked up some Haskell just for fun. I'm currently close to releasing an Android game.
A life-changing benefit I gained was "curing" my social anxiety. It was mostly thanks to a post made here linking to Mark Manson, but it totally changed the way I interact with people: from all fear and uneasiness to flow and actually enjoying being around people (especially women).
Other, less direct benefits: clearing up a lot of philosophical confusion, being saved from a couple of death spirals, having the memorization problem mostly solved with spaced repetition, changing my mind more often, strategic thinking, meta-thinking, and more stuff that gets increasingly abstract and that I don't think is in the spirit of the question.
To answer the question, I DO think that my past self was dumber than I am now, so in a way I've gotten smarter.
The one improvement that I'm fairly certain I can attribute to lesswrong/HPMOR/etc is getting better at morality. First, being introduced to and convinced of utilitarianism helped me get a grip on how to reason about ethics. Realizing that morality and "what I want the world to be like, when I'm at my best" are really similar, possibly the same thing, was also helpful. (And from there, HPMOR's Slytherins and the parts of Objectivism that EAs tend to like were the last couple of ideas I needed to learn how to have actual self-esteem.)
But as to the kinds of improvements you're interested in: I'm better at thinking strategically, often just from using some estimation in decision making. (If I built this product, how many people would I have to sell it to, at what price, to make it worth my time? That question often results in not building the thing.) But the time since I discovered lesswrong included my last two years of college and listening to startup podcasts to cope with a boring internship, so it's hard to attribute credit.
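The back-of-the-envelope estimate described above can be sketched as a few lines of arithmetic; all the numbers below are hypothetical placeholders, not figures from the comment:

```python
# Rough Fermi estimate: is building this product worth my time?
# Every input value here is an invented example, not real data.

def worth_building(hours_to_build, hourly_rate, price,
                   conversion_rate, reachable_audience):
    """Compare break-even sales against plausible sales."""
    opportunity_cost = hours_to_build * hourly_rate     # value of my time spent
    break_even_sales = opportunity_cost / price         # units needed to break even
    expected_sales = reachable_audience * conversion_rate
    return break_even_sales, expected_sales, expected_sales >= break_even_sales

break_even, expected, worth_it = worth_building(
    hours_to_build=200,    # estimated dev time, hours
    hourly_rate=30,        # what an hour of my time is worth, $
    price=10,              # sale price per unit, $
    conversion_rate=0.01,  # fraction of the audience who might buy
    reachable_audience=20_000,
)
print(break_even, expected, worth_it)  # → 600.0 200.0 False
```

With these made-up numbers the estimate says "don't build it" (you'd need 600 sales but can expect ~200), which matches the pattern in the comment: the value of the exercise is that it often kills the project before any time is sunk.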
My memory isn't better, but I haven't gone out of my way to improve it. I'm pretty sure that programming and reading about programming are much better ways of improving at programming than reading about rationality is. The sanity waterline is already pretty high in programming, so practicing and following best practices is more efficient than trying to work them out yourself from first principles.
It didn't surprise me at all to see that someone had made a post asking this question. The Sequences are a bit over-hyped, in that they suggest rationality might make the reader superhuman, and then that usually doesn't happen. I think I still got a lot of useful brain-tools from them, though. It's like a video game that was advertised as the game to end all games, and then it turns out to just be a very good game with a decent chance of becoming a classic. (For the record, my expectations didn't go quite that high, that I can remember, but it's not surprising that some people's did. It's possible mine did and I just take disappointment really well.)
right, that's what motivated the post. I feel like spending time learning "domain-specific knowledge" is much more effective than "general rationality techniques". like even if you want to get better at three totally different things over the course of a few years, the time spent on the general technique (that could help all three) might not help as much as time spent exclusively on specific techniques.
still, I tend to have faith in abstractions/generality, as my mind has good long-term memory and bad short-term memory. I guess this is... a crisis of faith, if you will, in "recursive personal cognitive enhancement" (lol).