Hello. I'm a typical geeky 20-something white male who's interested in science and technology. I have a bachelor's degree in economics and business. I'm not a native English speaker.
From the time I was 12 I've spent most of my time surfing the internet, reading about interesting things, generally wasting my time, and being alone. A few years ago I was really depressed and had a plan for suicide. Once in a while I've done something actually useful. That's my life in a nutshell.
I have always thought of myself as somewhat rational in the traditional sense when I'm not emotionally charged, but so do most people, I'd say. Who would be intentionally irrational?
When I first heard about LessWrong on 4chan/sci/ a few years ago, I heard only negative things about it. I got the impression that it was basically some kind of daydreaming cult for people interested in the singularity and transhumanism: people just write about things that sound kind of important and deep in a pop-science manner, but don't want to do anything more quantifiable or exact, or anything more difficult, like real science. I got the impression that it's not something you're supposed to take very seriously.
Okay, a few years go by, and I start to get more interested in futurology and related topics. I stumble upon Luke Muehlhauser's reddit AMA, and the things he talked about there sounded kind of cool, things I'd never really thought about before, so I read a few of his papers ("Intelligence Explosion and Machine Ethics", "Intelligence Explosion: Evidence and Import"). After this I forget about the whole thing again for a year, until I read his book "Facing the Intelligence Explosion", in which he goes to some lengths to talk about LessWrong, so I decide to take a look.
So I read the sequence "How To Actually Change Your Mind", and there were some useful things to consider if I want to be neutral in the face of evidence and change my mind about things. This Bayesian approach to rationality, or whatever it's called, sounds pretty reasonable, and I think I want to learn more about it. In the meantime I read Eliezer Yudkowsky's HPMOR and "Cognitive Biases Potentially Affecting Judgment of Global Risks", along with a few random LessWrong articles here and there. Sometimes Eliezer Yudkowsky sounds so full of himself, like he knows everything about everything, that it's pretty annoying. His narcissism and self-proclaimed genius remind me of Stephen Wolfram. But I like his optimism, he has really useful ideas to share about rationality, and he's good at writing.
I also started to wonder: if these people are trying to be so rational, why do so many of them hold seemingly irrational beliefs about some things without much quantifiable evidence? I mean, I have a gut feeling that the singularity will probably happen at some point if there isn't some societal collapse, but it's far from certain and may not happen the way FAI advocates anticipate. The event is so far in the future, and there are so many factors involved, that I'm not sure how well anyone can predict how it will happen or say meaningful things about it. Someone here made a good remark about it:
Furthermore, experts perform pretty badly when thinking about dynamic stimuli, thinking about behavior, and when feedback and objective analysis are unavailable.
Predictions about existential risk reduction and the far future are firmly in the second category. So how can we trust our predictions about our impact on the far future?
I also agree with many of the points raised in this post. I think the work MIRI is doing might be useful and I'm not against it, but I wouldn't personally allocate my resources toward it at this point, at least not money. Karnofsky criticized MIRI for not taking into account many variables he had considered, and on top of that there must be even more variables MIRI hasn't taken into account.
There are many beliefs here that seem to be based on non-quantifiable hypotheses. You would think that if you took a bunch of rationalists who applied the methods of rationality correctly and were willing to change their minds about their beliefs, the likelihood that they would share the same fringe beliefs based on non-quantifiable evidence would be pretty small. Note: I don't know everything about the community here; this is just from the little time I've spent here.
I hope MIRI, transhumanism, cryonics, polyamory, etc. are not inherently connected to LessWrong and its approach to rationality?
I still have a cautiously positive view of this community. Even though I dislike some of these fringe opinions, I'm still interested in decision theory and in this kind of approach to rationality, which I don't think is fringe at all, and I'm willing to learn more about it. I'm kind of a slow thinker, and sometimes when I'm around people it feels like I'm less intelligent than they are, like it takes longer for me to process things. By making good decisions I could minimize the impact of situations where my well-being depends wholly on quick thinking.
But I don't expect very much practical success, and most of all I think of this as a form of entertainment ("epiphany porn", as you like to call it); when I have more important things to do, I will probably set it aside.
A few notes about the site mechanics
A few notes about the community
If English is not your first language, don't let that make you afraid to post or comment. You can get English help on Discussion- or Main-level posts by sending a PM to one of the following users (use the "send message" link on the upper right of their user page). Either put the text of the post in the PM, or just say that you'd like English help and you'll get a response with an email address.
* Normal_Anomaly
* Randaly
* shokwave
* Barry Cotter
A note for theists: you will find the Less Wrong community to be predominantly atheist, though not completely so, and most of us are genuinely respectful of religious people who keep the usual community norms. It's worth saying that we might think religion is off-topic in some places where you think it's on-topic, so be thoughtful about where and how you start explicitly talking about it; some of us are happy to talk about religion, some of us aren't interested. Bear in mind that many of us really, truly have given full consideration to theistic claims and found them to be false, so starting with the most common arguments is pretty likely just to annoy people. Anyhow, it's absolutely OK to mention that you're religious in your welcome post and to invite a discussion there.
A list of some posts that are pretty awesome
I recommend the major sequences to everybody, but I realize how daunting they look at first. So for purposes of immediate gratification, the following posts are particularly interesting/illuminating/provocative and don't require any previous reading:
More suggestions are welcome! Or just check out the top-rated posts from the history of Less Wrong. Most posts at +50 or more are well worth your time.
Welcome to Less Wrong, and we look forward to hearing from you throughout the site!
Once a post gets over 500 comments, the site stops showing them all by default. If this post has 500 comments and you have 20 karma, please do start the next welcome post; a new post is a good perennial way to encourage newcomers and lurkers to introduce themselves. (Step-by-step, foolproof instructions here; takes <180 seconds.)
If there's anything I should add or update on this post (especially broken links), please send me a private message—I may not notice a comment on the post.
Finally, a big thank you to everyone who helped write this post via its predecessors!