Comment author: thomblake 27 June 2012 07:25:04PM *  1 point [-]

that's all that needs to be said

Would that this were true.

Indeed, if that were all there was to it, nothing would need to be said at all, as that's a tautology. But people manage to fail at noticing when things do / don't work anyway, and false ideas stick around a very long time.

Comment author: aceofspades 05 July 2012 06:31:46PM 0 points [-]

I just find it very unlikely that the specifics of how this post is constructed have much of an effect on correcting this issue.

Comment author: nshepperd 02 July 2012 06:15:57AM 2 points [-]

The world would be a very different place if, say, Thor existed and took a strong interest in the affairs of the human world.

Comment author: aceofspades 05 July 2012 06:29:37PM *  0 points [-]

If his interest resulted in actions that would provide evidence of his existence, then yes. Also, if libertarian free will existed then the world would be an even more different place.

Comment author: aceofspades 02 July 2012 05:02:52AM -2 points [-]

Arguing about the existence of a god is like arguing about free will. The only worthwhile argument concerns differences in anticipated experience, notably things like "Does prayer work?".

Comment author: aceofspades 02 July 2012 04:59:13AM -3 points [-]

I am curious why your posts tend to treat questions like this ("Does free will exist?") as being substantially different from questions like "Does some god exist?"

In response to Reductionism
Comment author: aceofspades 02 July 2012 04:46:06AM 1 point [-]

Does the reductionist model give different predictions about the world than the non-reductionist model? If so, are any easily checked?

In response to What is Evidence?
Comment author: aceofspades 27 June 2012 07:13:14PM *  -3 points [-]

I'm not sure that this terminology about entanglement and so forth actually helps understanding. Reading this post is unlikely to cause me to win more bets (make better predictions).

In response to Why truth? And...
Comment author: aceofspades 27 June 2012 07:05:09PM 1 point [-]

I'm not convinced that this post actually says anything. If seeking the truth is useful for any specific reason, then people who see some benefit from it will do so, and if it isn't useful then they won't. Actually writing this out has made me think both this post and my comment haven't really said much, but I think that's because this discussion is too abstract to have any real use/meaning. Ideas which are true/work will work, ideas that aren't won't, and that's all that needs to be said, never mind this business about rationality and truth and curiosity.

Comment author: MarkusRamikin 17 May 2012 04:46:56PM *  7 points [-]

I suppose it's that I naively expect, when opening the list of top LW posts ever, to see ones containing the most impressive or clever insights into rationality.

Not that I don't think Holden's post deserves a high score for other reasons. While I am not terribly impressed with his AI-related arguments, the post is of the very highest standards of conduct, of how to have a disagreement that is polite and far beyond what is usually named "constructive".

Comment author: aceofspades 07 June 2012 08:36:37PM 2 points [-]

Some people who upvoted the post may think it is one of the best-written and most important examples of instrumental rationality on this site.

Comment author: aceofspades 04 May 2012 12:39:16PM 4 points [-]

I think this post would benefit from a link to some article about the Iterated Prisoner's Dilemma, since the beginning of this post requires some knowledge about it to be valuable.

Comment author: loqi 26 May 2010 07:32:19AM 29 points [-]

One possible reason Alicorn hasn't applied her technique to you is that it simply isn't powerful enough to overcome your unpleasantness. FWIW, I perceive you as a lot less civil than the LW norm; you seem possessed of a snarky combativeness. You also appear to have a tendency to fixate on personal annoyances and justify your focus with concerns and observations that pop out of nowhere, context-wise.

In this case, your supposed insight into what would really be best for Alicorn plays that role. And then, having established this "lemma", you carry through to the conclusion that... Alicorn's behavior is inconsistent. Take a step back, and look at what you're saying. You're basically claiming to have reverse-engineered someone else's utility function, as the premise of an argument which concludes that they're being a hypocrite.

I hope you'll come to see this sort of behavior as embarrassing.

Comment author: aceofspades 23 April 2012 05:38:11PM 0 points [-]

"FWIW" == "For What It's Worth," to save a few person-minutes for other passive readers here.
