Epistemic hygiene is a huge factor for me. It is quite difficult to find a community that is nearly this active while retaining such high standards for content. Most of the online communities I've been involved with had a small pocket of individuals with relatively high standards, but I've never found such uniformly high quality as I've found here.
Hello, my name is Kevin. I am a very new Less Wrong member. I am by no means as smart as all of you; I'd say I'm your average Joe. I stumbled onto Less Wrong while researching the singularity about 7 months ago, and until this very post I have simply been reading all the discussions and posts. I haven't gotten around to reading all the sequences yet, for reasons I'm embarrassed about. I can't understand half of the stuff I read on this site, so when I do, I devote probably three hours a day to this site, because I have to google a lot of words or I'll get lost in the conversation. The reason I think Less Wrong is awesome is that to me it's really fun. I never went to college, and I really don't know if I'd be welcomed here by most of you, but I'll take my chances.
Welcome to Less Wrong!
I haven't gotten around to reading all the sequences yet, for reasons I'm embarrassed about.
I would definitely recommend reading the sequences, since they're pretty well written (maybe they're consequently more understandable? not sure) and there are a few articles in them that are on the high-tail end of fun.
I never went to college, and I really don't know if I'd be welcomed here by most of you, but I'll take my chances.
There are more people here in similar situations than you'd think.
I quickly lost motivation to keep writing for my own site, in part because the comments on my LW posts were of much higher quality than the comments on my own site.
It's handy to discuss things with people who (mostly) share a large amount of method and background knowledge. You can start conversations at a fairly high level, instead of having to explain over and over again the basics of probability theory or what truth is.
What makes Less Wrong awesome? Its members.
People who believe that a small group is going to take over the universe to save it, by building the seed of an artificial general intelligence that will undergo explosive recursive self-improvement, extrapolate the coherent volition of humanity, and acausally trade with other superhuman intelligences across the multiverse.
Do you really need more awesomeness?! Don't tell your doctor!
Well, Less Wrong is awesome if only for statements like this:
I bet there's at least one up-arrow-sized hypergalactic civilization folded into a halting Turing machine with 15 states, or something like that. [...] It might perhaps be more limited than this in mere practice, if it's just running on a laptop computer or something.
And that only scratches the surface! There are...
Once you grasp the full scope of Less Wrong, statements that would otherwise seem extraordinary begin to pale in comparison:
Whoever knowingly chooses to save one life, when they could have saved two - to say nothing of a thousand lives, or a world - they have damned themselves as thoroughly as any murderer.
What makes Less Wrong awesome is that it shows how the most extraordinary beliefs are actually held by atheist rationalists:
If you don't sign up your kids for cryonics then you are a lousy parent.
Nowhere but here can you find similar ideas:
According to the article, the AGI was almost completed, and the main reason his effort failed was that the company ran out of money due to the bursting of the bubble. Together with the anthropic principle, this seems to imply that Ben is the person responsible for the stock market crash of 2000.
or
I previously suggested that we discount each individual using something like the length of its address in the multiverse.
or
For example, you can convince everyone that quantum immortality works by killing them along with yourself. (This shouldn't pose any risk if you've already convinced yourself :-) Paul Almond has proposed that this can solve the Fermi paradox: we don't see alien civilizations because they have learned to solve complex computational problems by civilization-level quantum suicide, and thus disappeared from our view.
I bet there's at least one up-arrow-sized hypergalactic civilization folded into a halting Turing machine with 15 states, or something like that. [...]
Surely that loses points for speculating about what we already know. A simple counter would produce a bitmap of this universe's space-time matrix after a little while.
Because of Less Wrong, I am signed up for cryonics. In particular, Eliezer's writings informed me that cryonics is available and affordable, from organizations that take it seriously.
Oh! Does it change anything you do, or how you feel? I've heard people sometimes report feeling some of their (presumably non-useful?) worries about death slip away.
For me, it hasn't had much effect on how I feel, though I didn't tend to have strong emotional reactions to the concept that I would die at some point in the far future. But that really is not the point. My purpose for cryonics is not to feel like I'm not going to die, but to actually not die.
The original reason I'm here is because Eliezer is an engaging essayist. I started reading the sequences and didn't stop until I was finished.
I've since noticed that I'm disappointed in other internet forums; they now seem muddled and sluggish-thinking by comparison. I don't know if I've become spoiled, if I've justifiably altered my rationality set-point, or if I'm becoming a narrow-minded crank. But LW seems to me to have that magic balance: intelligent people saying intelligent things, with the ideas presented in a clear, accessible fashion.
The atmosphere of cooperative truth-seeking: it feels like ideas are taken seriously, there is a genuine effort to understand what others are saying, and there is frequent engagement with the least convenient possible world.
Also, awesome users like Eneasz for the HPMoR podcast, AdeleneDawner and PeerInfinity for the actually useful horoscopes, Vladimir Nesov and others for maintaining the wiki, Alicorn for her pony illustrations, jb55 and OneWhoFrogs for ebook versions of the sequences, BrandonReinhart for the examination of SIAI finances, and of course anyone who posts regularly. Since many of these kudos are for recent projects, I'm sure I've left a lot out.
One thing that makes Less Wrong awesome for me is that it's helped me seriously reduce my politics habit. I have a lot of posts on the xkcd forums, and politics (and economics) is a major recurring theme of those posts. Once I started thinking strategically about my involvement in politics, I found the following question useful: what is the optimal level of effort to put into politics (i.e. the level where marginal benefit equals marginal cost)? (A toy sketch of this calculation appears below.)
The result, of course, is that I should put almost zero effort into politics. I still focus heavily on economics, human psychology, and systems design because those are professionally relevant for me, but I try to avoid discussions that are essentially debates on identities, not math or facts.
I've noticed a number of benefits from this: I have more available time, I have happier hobbies (when I do get into political discussions these days, I notice the physiological residue of anger, which I generally try to avoid), and I don't waste money on political donations.
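(As a toy illustration of that marginal-benefit-equals-marginal-cost question: the sketch below scans effort levels until the assumed marginal benefit of another hour of politics no longer exceeds its assumed marginal cost. The benefit and cost curves, and the names `marginal_benefit`, `marginal_cost`, and `optimal_hours`, are entirely made up for illustration; nothing here comes from the comment above.)

```python
# Toy sketch of "optimal effort is where marginal benefit equals marginal cost".
# The curves are invented placeholders, purely for illustration.

def marginal_benefit(hours: float) -> float:
    """Assumed diminishing returns: each extra hour of politics yields less."""
    return 10.0 / (1.0 + hours)

def marginal_cost(hours: float) -> float:
    """Assumed rising opportunity cost as politics crowds out other pursuits."""
    return 2.0 + hours

def optimal_hours(step: float = 0.01, cap: float = 40.0) -> float:
    """Increase effort while the marginal benefit still exceeds the marginal cost."""
    hours = 0.0
    while hours < cap and marginal_benefit(hours) > marginal_cost(hours):
        hours += step
    return hours

print(f"Optimal effort: about {optimal_hours():.2f} hours/week")
```

With these made-up curves the crossing point lands around 1.7 hours per week; any curves with diminishing benefits and rising costs give the same qualitative answer, a small but nonzero optimum.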
Off-topic: After glancing over the post title, I misread your nickname as Will_Awesome :)
On-topic: What's awesome about LW for me? Thanks to LW, I now enjoy long stretches of very productive time, with little to no akrasia. So, my answer is Instrumental Rationality.
(I also suspect that LW has subconsciously improved my epistemic rationality, as I keep noticing nice little benefits here and there, but I've yet to bring this to the conscious level.)
I like Less Wrong because it is an easily accessible, rich source of original content on matters of interest to me - ethics, morality, rational decision making and general philosophy. Let me elaborate:
Easily accessible - most of Less Wrong is at a level that, for me, strikes a sweet spot between easy to understand and interesting
Rich - the topics here are valuable and manifold
Original content - importantly, the material on the site is for the most part original writing (cf. Hacker News)
I'd recommend striving to develop the "rich, original content" by cherishing dissent and controversy, provided these are well presented.
True creativity happens on the border of order and chaos (Italian proverb).
Where else on the internet are people willing to change their minds?
Many scientists are willing to change their minds. Even normal people change their minds often. People become atheists or start voting for a different party. How many members here can you actually name who changed their mind about something dramatic?
Someone rather cynical about Less Wrong could go a step further and conclude that Less Wrong appears to be about changing your mind, but that it mainly attracts people who already tend to agree with the ideas put forth on Less Wrong and who take ideas seriously. Everyone else turns their back on it or gets filtered out quickly. And those who already agree are not going to change their minds again, because they are not entitled to the particular proof necessary to change their minds: most of the controversial ideas are framed as predictions or logical implications that are not subject to empirical criticism. What is left over is too vague or insubstantial to change your mind about one way or the other.
Someone even more cynical might say that Less Wrong only departs from mainstream skeptical scientific consensus in ways that coincidentally line up exactly with the views of Eliezer Yudkowsky, and that it's basically an echo chamber.
That said, rational thinking is a great ideal, and I think it's awesome that Less Wrong even TRIES to live up to it.
When I discovered Less Wrong, there were things I disagreed with. There are actually still things discussed here where I disagree with the apparent consensus. But I've changed my mind on a large number of things. When I joined Less Wrong, my understanding of cryonics was that it was a scam for new-agers. I had heard of concepts such as transhumanism and singularitarianism, but had no exposure to individuals who actually held such beliefs. After reading a few of the sequences, I went to EY's website, and found this. I finished that article, thought about it for approximately a minute, and said "Yep. That makes sense." Fast forward one week, and I'm persuading other people to sign up for cryonics. That was a pretty dramatic shift for me.
it mainly attracts people who already tend to agree with the ideas put forth on Less Wrong
Hm, in my case I have to say that reading Less Wrong changed almost all my beliefs: roughly 9 months ago I was a socialist, an anti-reductionist, an agnostic leaning towards deism, a new-age-minded guy who loved psychedelic drugs and marijuana. I was proud of my existential angst and read like-minded philosophy and literature. I had no idea of transhumanism, the singularity, existential risks or the FAI problem.
I don't believe that I'm the only one who changed his mind after reading the sequences; I'm not that special!
That's interesting; I didn't expect that, since I thought that most people who could benefit a lot from LW are most likely not going to read it or understand it. But maybe I'm wrong; I seldom encounter stories like yours.
But that you went all the way from new-age to the Singularity and FAI troubles me a bit. Not that it isn't better than the new-age stuff, but can you tell me what exactly convinced you of risks from AI?
Well, to be clear, I didn't believe in homeopathy or astrology or other obviously false crackpot theories. In fact, some of my heroes were skeptics like Bertrand Russell and Richard Dawkins. But I also believed in some objective, transcendental morality stuff, and I tried to combine mystic, mysterious interpretations of quantum physics with Buddhist philosophy (you know, the Atman is the Brahman, etc.), just like Schrödinger, Bohm and so on. And I (wanted to) believe in the sort of free will proposed by Kant. I didn't understand what I was thinking, and I had the feeling that something was wrong or inconsistent with my beliefs. When I was younger I was much more confident in materialism and atheism, but some drug experiences disturbed me and I began to question my worldviews. Anyway, let's say I believed in enlightened, deeply wise-sounding new-age gibberish. I know, I know, it's embarrassing, but hopefully not that embarrassing.
Well, some essays by Bostrom and mainly the sequences convinced me of the risks of AI. I'm not as sure about it as, e.g., Yudkowsky (in fact I think it is more likely than not that his scenario is false), but if we assign a 25% probability to the Yudkowskian AI-foom scenario, it still seems absurdly important, right? And Yudkowsky makes more sense to me than Hanson or Goertzel, and folks like Kurzweil, and especially De Garis, seem to be off base.
I am just starting out here, but I feel as if I'm about to change my mind in the same way you did. I was interested in Utopia (ending suffering), and that got me pulled into Buddhism and all the other parapsychological weirdness.
I'm fairly enthusiastic about LW, but I think that
it mainly attracts people who already tend to agree with the ideas put forth on Less Wrong and who take ideas seriously. Everyone else turns their back on it or gets filtered out quickly.
has a big effect.
The idea and practice of instrumental rationality is the most important thing I've gained from the site. I used to be the typical smart, atheistic contrarian, who didn't see any point in trying particularly hard socially; now I'm consciously pursuing many promising avenues for improving my life and social skills, to aid in attaining my goals.
I've read for a while now, because LW is interesting. But I recently had a little bit of one-on-one interaction with AnnaSalamon, lukeprog and Carl_Shulman, and they're also just great people to interact with personally.
It seems that at least some of the people here are just plain really great people (in addition to being interesting).
I've been surprised at just how easy it is to find people who know so much more than you (because they've been here longer) and doubly surprised because they are willing to interact with you one on one.
Recently I asked "What bothers you about Less Wrong?". It might be worth going back and checking out what people had to say, to see if there's something you can do to make Less Wrong more fun for everyone. (A few people made cool posts in response to complaints about lack of technical discussion, for instance.)
Let's hear the other side. What is cool about Less Wrong? What drew you in, what makes you stay, what makes you obsessively read every comment of every post? Is there something we're doing right that we should be doing more of? Bonus points for pointing out how we can make our awesome traits even more awesome, or how to make our awesomeness more obvious to outside folks who'd appreciate it. Whatever it is, add it to the comments.