
Clarity comments on Survey Article: How do I become a more interesting person? - Less Wrong Discussion

5 points · Post author: casebash 18 October 2015 10:04AM


Comments (44)


Comment author: gjm 19 October 2015 01:02:58PM 2 points

I don't have any compelling reason to believe a given LW karma giver is any better at predicting the future than not.

Tomorrow, sunrise where I am will be somewhere around 7.30. In the next year, there will not be a nuclear war. The human race will still exist a century from now. If I roll a hundred ordinary 6-sided dice, some of them will come up 1 but not more than 1/3 of them. The best computer I can buy two years from now for $1000 will be faster than the best I can currently buy for $1000, but not as much as twice as fast. Lawrence Lessig will not be the Democratic nominee for the US presidency in 2016. If you take two objects of similar shape, structure and material, one twice the size of the other, and thump both of them, the larger one will make a lower-pitched sound. Within the next month there will be articles in major UK newspapers saying unflattering things about both David Cameron and Jeremy Corbyn, and articles saying flattering things about both. This time next year, the total value of my pension funds will be between half and double what it is now.
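The dice prediction above can be checked numerically. A minimal simulation sketch (assuming fair six-sided dice and 10,000 trials, both my choices rather than anything stated in the comment) estimates how often 100 dice show at least one 1 but no more than a third of them:

```python
import random

random.seed(0)
trials = 10_000
hits = 0
for _ in range(trials):
    # Count how many of 100 fair d6 rolls come up 1
    # (the expected count is 100/6, roughly 16.7).
    ones = sum(1 for _ in range(100) if random.randint(1, 6) == 1)
    # The prediction: at least one 1, but not more than 1/3 of the dice.
    if 1 <= ones <= 33:
        hits += 1
print(hits / trials)
```

The fraction printed is essentially 1.0: the chance of zero ones is (5/6)^100, on the order of 10^-8, and 33 ones sits more than four standard deviations above the mean, so the prediction is safe despite not being trivial.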

I will be very surprised (and you should be, too) if more than one of those predictions is wrong. None of them is trivial. None of them was difficult to make. I am sure you would have no difficulty making a similar set of predictions with similar accuracy.

Even if you "consider right to be accurately predicting future events" (which is, at best, a controversial definition), LW readers -- and people generally, in fact -- are pretty good at being right.

other factors I believe are predictive of karma

What are those other factors, that predict karma well enough that knowing whether what's said is right yields no further improvement in karma prediction on top of them?

Comment author: Clarity 19 October 2015 02:18:35PM -1 points

tomorrow

Very well said, but you're missing my point. I wanted to emphasise *any given* LW user. Although particular LW users are very good at predicting, in aggregate they do not appear to be the same ones who make the effort to vote.

What are those other factors, that predict karma well enough that knowing whether what's said is right yields no further improvement in karma prediction on top of them?

In my model, karma is already overdetermined. It's not a very good model, but the same sorts of factors that describe human behaviour, as is canon on LW, go into it. I may elaborate in the future, but like I said, it's probably not worth anyone's time, and I'd rather do something else than clarify it myself.