The Most Important Thing You Learned

13 Eliezer_Yudkowsky 27 February 2009 08:15PM

My current plan does still call for me to write a rationality book - at some point, and despite all delays - which means I have to decide what goes in the book, and what doesn't.  Obviously the vast majority of my OB content can't go into the book, because there's so much of it.

So let me ask - what was the one thing you learned from my posts on Overcoming Bias, that stands out as most important in your mind?  If you like, you can also list your numbers 2 and 3, but it will be understood that any upvotes on the comment are just agreeing with the #1, not the others.  If it was striking enough that you remember the exact post where you "got it", include that information.  If you think the most important thing is for me to rewrite a post from Robin Hanson or another contributor, go ahead and say so.  To avoid recency effects, you might want to take a quick glance at this list of all my OB posts before naming anything from just the last month - on the other hand, if you can't remember it even after a year, then it's probably not the most important thing.

Please also distinguish this question from "What was the most frequently useful thing you learned, and how did you use it?" and "What one thing has to go into the book that would (actually) make you buy a copy of that book for someone else you know?"  I'll ask those on Saturday and Sunday.

PS:  Do please think of your answer before you read the others' comments, of course.

About Less Wrong

50 Eliezer_Yudkowsky 23 February 2009 11:30PM

Over recent decades, new experiments have changed science's picture of the way we think - the ways we succeed or fail to obtain the truth, or fulfill our goals. The heuristics and biases program in cognitive psychology has exposed dozens of major flaws in human reasoning. Social psychology shows how we succeed or fail in groups. Probability theory and decision theory have given us new mathematical foundations for understanding minds.

Less Wrong is devoted to refining the art of human rationality - the art of thinking. This new math and science deserve to be applied to our daily lives, and heard in our public voices.

Less Wrong consists of three areas: the main community blog, the Less Wrong wiki, and the Less Wrong discussion area.

Less Wrong is a partially moderated community blog to which users at large can contribute both posts and comments. Users vote posts and comments up and down (using code based on Reddit's open-source codebase). "Promoted" posts (appearing on the front page) are chosen by the editors on the basis of substantive new content, clear argument, good writing, popularity, and importance.

We suggest submitting links with a short description. Recommended books should have longer descriptions. Links will not be promoted unless they are truly excellent - the "promoted" posts are intended as a filtered stream for the casual/busy reader.

The Less Wrong discussion area is for topics not yet ready for, or not suitable for, normal top-level posts. To post a new discussion, select "Post to: Less Wrong Discussion" from the Create new article page. Comment on discussion posts as you would elsewhere on the site.

Votes on posts are worth ±10 points on the main site and ±1 point in the discussion area. Votes on comments are worth ±1 point. Users with sufficient karma can publish posts: you need 20+ points to post to the main area and 2+ points to post to the discussion area. You can only downvote up to four times your current karma (thus if you never comment, you cannot downvote).

Comments voted to -3 or lower will be collapsed by default for most readers (if you log in, you can change this setting in your Preferences). Please keep this in mind before writing long, thoughtful, intelligent responses to trolls: most readers will never see your work, and your effort may be better spent elsewhere, in more visible threads. Similarly, if many of your comments are heavily downvoted, please take the hint and change your approach, or choose a different venue for your comments. (Failure to take the hint may lead to moderators deleting future comments.) Spam comments will be deleted immediately. Off-topic top-level posts may be removed.
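
For concreteness, here is a toy sketch of the karma arithmetic above, written in Python. The function names are hypothetical illustrations; the site's actual logic lives in its Reddit-derived codebase:

    # Toy model of the karma rules described above.
    # Names and structure are hypothetical, not the site's real code.

    POST_VOTE_MAIN = 10       # vote on a main-site post: +/-10 karma
    POST_VOTE_DISCUSSION = 1  # vote on a discussion-area post: +/-1 karma
    COMMENT_VOTE = 1          # vote on any comment: +/-1 karma

    def can_post(karma, area="main"):
        """20+ karma to post to the main area; 2+ for the discussion area."""
        return karma >= (20 if area == "main" else 2)

    def downvote_budget(karma, downvotes_cast):
        """You may cast at most 4x your current karma in downvotes."""
        return max(0, 4 * karma - downvotes_cast)

    def is_collapsed(comment_score):
        """Comments voted to -3 or lower are collapsed for most readers."""
        return comment_score <= -3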

Moderators reserve the right to edit contributed posts or comments to fix HTML problems or other misfeatures, and may add or remove tags.

Less Wrong is brought to you by the Future of Humanity Institute at Oxford University. Neither FHI nor Oxford University necessarily endorses any specific views appearing anywhere on Less Wrong. Copyright is retained by each author, but we reserve the non-exclusive right to move, archive, or otherwise reprint posts and comments.

Issues, Bugs, and Requested Features

10 Eliezer_Yudkowsky 26 February 2009 04:45PM

[Edit: Issues, Bugs, and Requested Features should be tracked at Google Code, not here -- matt, 2010-04-23]

Less Wrong is still under construction.  Please post any bugs or issues with Less Wrong to this thread.  Try to keep each comment thread to a clean discussion of a single bug or issue.

Requested features... sure, go ahead, but bear in mind that we may not be able to implement them for a while.

Kinnaird's truels

25 Johnicholas 05 March 2009 04:50PM

A "truel" is something like a duel, but among three gunmen. Martin Gardner popularized a puzzle based on this scenario, and there are many variants of the puzzle which mathematicians and game theorists have analyzed.

The optimal strategy varies with the details of the scenario, of course. One takeaway from these analyses is that it is often disadvantageous to be very skillful: a very skillful gunman is a high-priority target.
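
To make that concrete, here is a minimal Monte Carlo sketch in Python. The accuracies (0.9, 0.6, 0.3), the simultaneous-fire rule, and the "shoot the most accurate surviving rival" strategy are illustrative assumptions, not Gardner's exact puzzle:

    # Monte Carlo truel: everyone fires at once, each targeting the most
    # accurate surviving rival. Parameters are assumptions for illustration.
    import random

    ACCURACIES = {"A": 0.9, "B": 0.6, "C": 0.3}  # hit probabilities (assumed)

    def truel_round(alive):
        """One simultaneous round; returns the list of survivors."""
        hits = set()
        for shooter in alive:
            rivals = [g for g in alive if g != shooter]
            target = max(rivals, key=lambda g: ACCURACIES[g])
            if random.random() < ACCURACIES[shooter]:
                hits.add(target)
        return [g for g in alive if g not in hits]

    def survival_rates(trials=100_000):
        wins = dict.fromkeys(ACCURACIES, 0)
        for _ in range(trials):
            alive = list(ACCURACIES)
            while len(alive) > 1:  # simultaneous fire can kill everyone
                alive = truel_round(alive)
            for g in alive:
                wins[g] += 1
        return {g: wins[g] / trials for g in ACCURACIES}

    print(survival_rates())

In runs of this sketch, C, the worst shot, survives far more often than A or B: nobody bothers to target him until the stronger rivals have eliminated each other. That is the sense in which skill itself can be a liability.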

The environment of evolutionary adaptedness undoubtedly contained multiplayer social games. If some of these games had a truel-like structure, they may have rewarded mediocrity. This might be an explanation of psychological phenomena like "fear of success" and "choking under pressure".

Robin Hanson has mentioned that there are costs to "truth-seeking". One example cost might be having to convincingly declare "I believe in God" in order to be accepted into a religious community. I think truels are a game-theoretic structure suggesting that there are also costs to (short-sighted) "winning", just as there are costs to "truth-seeking".

How can you identify truel-like situations? What should you (a rationalist) do if you might be in a truel-like situation?

 
