A few notes about the site mechanics
A few notes about the community
If English is not your first language, don't let that make you afraid to post or comment. You can get English help on Discussion- or Main-level posts by sending a PM to one of the following users (use the "send message" link on the upper right of their user page). Either put the text of the post in the PM, or just say that you'd like English help and you'll get a response with an email address.
* Normal_Anomaly
* Randaly
* shokwave
* Barry Cotter
A note for theists: you will find the Less Wrong community to be predominantly atheist, though not completely so, and most of us are genuinely respectful of religious people who keep the usual community norms. It's worth saying that we might think religion is off-topic in some places where you think it's on-topic, so be thoughtful about where and how you start explicitly talking about it; some of us are happy to talk about religion, some of us aren't interested. Bear in mind that many of us really, truly have given full consideration to theistic claims and found them to be false, so starting with the most common arguments is pretty likely just to annoy people. Anyhow, it's absolutely OK to mention that you're religious in your welcome post and to invite a discussion there.
A list of some posts that are pretty awesome
I recommend the major sequences to everybody, but I realize how daunting they look at first. So for purposes of immediate gratification, the following posts are particularly interesting/illuminating/provocative and don't require any previous reading:
- Your Intuitions are Not Magic
- The Apologist and the Revolutionary
- How to Convince Me that 2 + 2 = 3
- Lawful Uncertainty
- The Planning Fallacy
- Scope Insensitivity
- The Allais Paradox (with two follow-ups)
- We Change Our Minds Less Often Than We Think
- The Least Convenient Possible World
- The Third Alternative
- The Domain of Your Utility Function
- Newcomb's Problem and Regret of Rationality
- The True Prisoner's Dilemma
- The Tragedy of Group Selectionism
- Policy Debates Should Not Appear One-Sided
- That Alien Message
More suggestions are welcome! Or just check out the top-rated posts from the history of Less Wrong. Most posts at +50 or more are well worth your time.
Welcome to Less Wrong, and we look forward to hearing from you throughout the site.
(Note from orthonormal: MBlume and other contributors wrote the original version of this welcome message, and I've stolen heavily from it.)
It's not a rhetorical question, you know. What happens if you try to answer it?
I have a pill in my hand. I'm .99 confident that, if I take it, it will grant me a thousand units of something valuable. (It doesn't matter for our purposes right now what that unit is. We sometimes call it "utilons" around here, just for the sake of convenient reference.) But there's also a .01 chance that it will instead take away ten thousand utilons. What should I do?
It's called reasoning under uncertainty, and humans aren't very good at it naturally. Personally, my instinct is to either say "well, it's almost certain to have a good effect, so I'll take the pill" or "well, it would be really bad if it had a bad effect, so I won't take the pill", and lots of studies show that which of those I say can be influenced by all kinds of things that really have nothing to do with which choice leaves me better off.
One way to approach problems like this is by calculating expected values. Taking the pill gives me a .99 chance of 1000 utilons and a .01 chance of -10000 utilons; the expected value is therefore .99 × 1000 - .01 × 10000 = 990 - 100 = 890. The result is positive, so I should take the pill. If I instead estimated a .9 chance of upside and a .1 chance of downside, the calculation would be .9 × 1000 - .1 × 10000 = 900 - 1000 = -100; negative result, so I shouldn't take the pill.
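If it helps to see that arithmetic spelled out as code, here's a minimal sketch in Python. The expected_value helper and its numbers are just my restatement of the example above, nothing more:

```python
def expected_value(p_good, gain, loss):
    """Probability-weighted sum of outcomes: p_good * gain - (1 - p_good) * loss."""
    return p_good * gain - (1 - p_good) * loss

# .99 chance of gaining 1000 utilons, .01 chance of losing 10000
print(expected_value(0.99, 1000, 10000))  # roughly 890: positive, so take the pill
# .9 chance of the upside, .1 chance of the downside
print(expected_value(0.90, 1000, 10000))  # roughly -100: negative, so don't
```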
That approach has its weaknesses, but in a lot of cases it has definite advantages over the one that's wired into my brain.
The same principle applies if I estimate a .99 chance that by adopting the ideology in my hand, I will make better choices, and a .01 chance that adopting that ideology will lead me to do evil things instead.
Of course, what that means is that there's a huge difference between being 99% certain and being 99.99999999% certain. It means that there's a huge difference between being mistaken in a way that kills millions of people, and being mistaken in a way that kills ten people. It means that it's not enough to say "that's good" or "that's evil"; I actually have to do the math, which takes effort. That's an off-putting proposition; it's far simpler to stick with my instinctive analysis, even if it's less useful.
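To make the point about orders of magnitude concrete, here's another small sketch using the same toy helper; every payoff number below is made up purely for illustration:

```python
def expected_value(p_good, gain, loss):
    # Same toy formula as above: probability-weighted sum of outcomes.
    return p_good * gain - (1 - p_good) * loss

# Illustrative only: a gamble worth 1000 utilons if I'm right, with either a
# small or a catastrophic downside if I'm wrong.
for confidence in (0.99, 0.9999999999):
    for downside in (10, 1_000_000):
        ev = expected_value(confidence, 1000, downside)
        verdict = "worth taking" if ev > 0 else "not worth taking"
        print(f"p={confidence}, downside={downside}: EV={ev:,.2f} -> {verdict}")
```

With a small downside, the extra nines barely matter; with a catastrophic downside, those extra nines are exactly what flips the verdict.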
At some point, the question becomes whether I feel like making that effort.