If you've recently joined the Less Wrong community, please leave a comment here and introduce yourself. We'd love to know who you are, what you're doing, what you value, how you came to identify as a rationalist or how you found us. You can skip right to that if you like; the rest of this post consists of a few things you might find helpful. More can be found at the FAQ.
A few notes about the site mechanics
A few notes about the community
If English is not your first language, don't let that make you afraid to post or comment. You can get English help on Discussion- or Main-level posts by sending a PM to one of the following users (use the "send message" link on the upper right of their user page). Either put the text of the post in the PM, or just say that you'd like English help and you'll get a response with an email address.
* Normal_Anomaly
* Randaly
* shokwave
* Barry Cotter
A note for theists: you will find the Less Wrong community to be predominantly atheist, though not completely so, and most of us are genuinely respectful of religious people who keep the usual community norms. It's worth saying that we might think religion is off-topic in some places where you think it's on-topic, so be thoughtful about where and how you start explicitly talking about it; some of us are happy to talk about religion, some of us aren't interested. Bear in mind that many of us really, truly have given full consideration to theistic claims and found them to be false, so starting with the most common arguments is pretty likely just to annoy people. Anyhow, it's absolutely OK to mention that you're religious in your welcome post and to invite a discussion there.
A list of some posts that are pretty awesome
I recommend the major sequences to everybody, but I realize how daunting they look at first. So for purposes of immediate gratification, the following posts are particularly interesting/illuminating/provocative and don't require any previous reading:
- Your Intuitions are Not Magic
- The Apologist and the Revolutionary
- How to Convince Me that 2 + 2 = 3
- Lawful Uncertainty
- The Planning Fallacy
- Scope Insensitivity
- The Allais Paradox (with two followups)
- We Change Our Minds Less Often Than We Think
- The Least Convenient Possible World
- The Third Alternative
- The Domain of Your Utility Function
- Newcomb's Problem and Regret of Rationality
- The True Prisoner's Dilemma
- The Tragedy of Group Selectionism
- Policy Debates Should Not Appear One-Sided
- That Alien Message
More suggestions are welcome! Or just check out the top-rated posts from the history of Less Wrong. Most posts at +50 or more are well worth your time.
Welcome to Less Wrong, and we look forward to hearing from you throughout the site.
We are not discussing what to do "in general", or the algorithms of a general "I" that should or shouldn't behave a certain way in certain problems; we are discussing what should be done in this particular problem, where we might as well assume that there is no other possible problem and that all utility in the world comes from this one instance of it. The focus is on this problem only: no role is played by uncertainty about which problem we are solving, or by the possibility that there might be other problems. If you additionally want to avoid the logical impossibility introduced by some of the possible decisions, permit a very low probability that either of the relevant outcomes can occur anyway.
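To make the isolated setup concrete, here is a minimal sketch, assuming the usual $1,000 / $1,000,000 payoffs and a predictor that errs with some tiny probability epsilon; the function name and the exact numbers are illustrative assumptions on my part, not part of the problem statement above.

```python
# A minimal sketch of isolated Transparent Newcomb's, assuming the usual
# payoffs ($1,000 in the small box, $1,000,000 in the big box) and a
# predictor error rate `epsilon`. Names and numbers are illustrative.

def expected_payoff(policy: str, epsilon: float = 1e-6) -> float:
    """Expected payoff for a fixed policy, evaluated before seeing the boxes.

    policy: 'one-box' or 'two-box' -- what you do on seeing the big box full.
    The predictor fills the big box iff it predicts you would one-box on
    seeing it full; it errs with probability epsilon.
    """
    small, big = 1_000, 1_000_000
    if policy == "one-box":
        # Predicted correctly (prob 1 - epsilon): big box is full, you take only it.
        # Predictor errs (prob epsilon): big box is empty, you get only the small box.
        return (1 - epsilon) * big + epsilon * small
    if policy == "two-box":
        # Predicted correctly (prob 1 - epsilon): big box is empty, you get the small box.
        # Predictor errs (prob epsilon): big box is full, you get both.
        return (1 - epsilon) * small + epsilon * (small + big)
    raise ValueError(policy)

if __name__ == "__main__":
    for p in ("one-box", "two-box"):
        print(p, expected_payoff(p))
```

With any small epsilon the one-boxing policy comes out ahead by roughly a factor of a thousand; that is the sense in which the isolated problem, taken on its own terms, has a determinate answer.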
If you allow yourself to consider alternative situations, or other applications of the same decision algorithm, you are solving a different problem, one that involves tradeoffs between these situations. You need to be clear about which problem you are considering: a single isolated problem, as is usual for thought experiments, or a bigger problem. If it's a bigger problem, that needs to be stipulated prominently somewhere, or people will assume otherwise and you'll talk past each other.
It seems as if you currently believe that the correct solution for isolated Transparent Newcomb's is one-boxing, but that the correct solution in the context of the possibility of other problems is two-boxing. Is that so? (You also seem to misunderstand the statement "I'm in Transparent Newcomb's problem", which further motivates fighting the hypothetical by suggesting that two-boxing is better for a general player with other problems on its plate. That is not so, but it's a separate issue, so let's settle the problem statement first.)
Yes.
I don't think the question of the most advantageous solution for isolated Transparent Newcomb's is likely to be very useful, though.
I don't think it's possible to have a general-case decision theory which gets the best possible results for every situation (see the Andy and Sandy example, where getting good results for one prisoner's dile...