A few notes about the site mechanics
A few notes about the community
If English is not your first language, don't let that make you afraid to post or comment. You can get English help on Discussion- or Main-level posts by sending a PM to one of the following users (use the "send message" link on the upper right of their user page). Either put the text of the post in the PM, or just say that you'd like English help and you'll get a response with an email address.
* Normal_Anomaly
* Randaly
* shokwave
* Barry Cotter
A note for theists: you will find the Less Wrong community to be predominantly atheist, though not completely so, and most of us are genuinely respectful of religious people who keep the usual community norms. It's worth saying that we may consider religion off-topic in some places where you consider it on-topic, so be thoughtful about where and how you bring it up explicitly; some of us are happy to talk about religion, and some of us aren't interested. Bear in mind that many of us really, truly have given full consideration to theistic claims and found them to be false, so opening with the most common arguments is pretty likely just to annoy people. That said, it's absolutely OK to mention that you're religious in your welcome post and to invite a discussion there.
A list of some posts that are pretty awesome
I recommend the major sequences to everybody, but I realize how daunting they look at first. So for purposes of immediate gratification, the following posts are particularly interesting/illuminating/provocative and don't require any previous reading:
- The Worst Argument in the World
- That Alien Message
- How to Convince Me that 2 + 2 = 3
- Lawful Uncertainty
- Your Intuitions are Not Magic
- The Planning Fallacy
- The Apologist and the Revolutionary
- Scope Insensitivity
- The Allais Paradox (with two follow-ups)
- We Change Our Minds Less Often Than We Think
- The Least Convenient Possible World
- The Third Alternative
- The Domain of Your Utility Function
- Newcomb's Problem and Regret of Rationality
- The True Prisoner's Dilemma
- The Tragedy of Group Selectionism
- Policy Debates Should Not Appear One-Sided
More suggestions are welcome! Or just check out the top-rated posts from the history of Less Wrong. Most posts at +50 or more are well worth your time.
Welcome to Less Wrong, and we look forward to hearing from you throughout the site!
Once a post gets over 500 comments, the site stops showing them all by default. If this post has 500 comments and you have 20 karma, please do start the next welcome post; a new post is a good perennial way to encourage newcomers and lurkers to introduce themselves. (Step-by-step, foolproof instructions here; takes <180 seconds.)
If there's anything I should add or update on this post (especially broken links), please send me a private message—I may not notice a comment on the post.
Finally, a big thank you to everyone who helped write this post via its predecessors!
Hi. I'm just leaving a few comments about myself and the research I have been doing that people here may find interesting. I joined just a couple of days ago, so I am not yet sure about the local conventions; this seems to be the proper place for a first post, and I am guessing the format and contents are free.
While I started out as a normal theoretical physicist, I was always interested in the question of why we believe in some theories, and I think that for a while I felt we were not doing everything right. As I went through my professional life, I had to start interacting with people from different areas, and that meant a need to learn statistics. Oddly, I taught myself Bayesian methods before I even knew there was such a thing as hypothesis testing.
Today, my research involves parts of Opinion Dynamics (I am still a theoretical physicist there, somehow), and I have increasingly been using results from human-cognition experiments to understand a few things, as well as a Bayesian framework to generate my models. I have also done a small amount of research on evolutionary models. But my main interest at the moment can easily be seen in a paper I have just put online at the ArXiv preprint site. Indeed, while I already knew this site and found it interesting, time constraints meant I never really planned to write anything here. The reason I actually joined now is that I think you will find the whole discussion in the paper quite interesting. I do think that my main conclusion there about human reasoning and its consequences is so obvious that it always amazes me how deep our instincts must run for it to have remained hidden.
There is a series of biases and effects that kick in when we decide to support an idea, and those biases make us basically unable to change our minds or, in other words, to learn. In the paper I examine the concept of choosing an idea to support in light of what we know about rationality, I run a small simulation experiment with different models suggesting that our desire to hold only one idea is behind extremist points of view, and I finally discuss the consequences of it all for scientific practice. There is also a book planned, with many more details and aimed at the layperson; the first draft is complete, but it will still take a while before the book is out. The article is in drier prose, of course.
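Just to give a flavour of the kind of model I mean, here is a minimal toy sketch in Python. The agents, the parameters (N, q, steps) and the update rule are all illustrative assumptions of mine for this comment, not the actual models analysed in the paper: agents hold a probability that some proposition A is true, but they communicate only which side they currently support, and listeners update by Bayes' rule on that discrete choice.

```python
# Toy sketch (illustrative only, not the model from the paper): agents hold
# a probability that proposition A is true, stored as log-odds.  Each step a
# random "listener" observes only the side a random "speaker" currently
# supports and updates by Bayes' rule, assuming the speaker names the correct
# side with probability q > 1/2.  Because only the discrete choice is
# communicated, opinions drift toward extreme log-odds, i.e. ever greater
# certainty.

import math
import random

N = 50        # number of agents (illustrative value)
q = 0.7       # assumed chance that a speaker's chosen side is the true one
steps = 10000

# Observing a side shifts the listener's log-odds by +/- log(q / (1 - q)).
step_size = math.log(q / (1 - q))

# Start everyone close to indifference (log-odds near 0).
opinion = [random.uniform(-0.1, 0.1) for _ in range(N)]

for _ in range(steps):
    i, j = random.sample(range(N), 2)      # listener i, speaker j
    side = 1 if opinion[j] > 0 else -1     # j communicates only a side, not a probability
    opinion[i] += side * step_size         # Bayesian update in log-odds form

# Large |log-odds| means near-certainty; compare with the initial spread of 0.1.
print("mean |log-odds| after", steps, "steps:",
      sum(abs(v) for v in opinion) / N)
```

In a toy run like this, the average certainty keeps growing even though no new evidence about A ever enters the system, which is roughly the flavour of effect I am concerned with.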
Anyway, while the article is still under submission for publication, the preprint is available at
http://arxiv.org/abs/1508.05169
The article is called "Thou shalt not take sides: Cognition, Logic and the need for changing how we believe". I do think the people here will have a lot of fun with it.
Best, André
Thanks for the link! Very nice publication!