
Perplexed comments on John Baez Interviews with Eliezer (Parts 2 and 3) - Less Wrong Discussion

7 Post author: multifoliaterose 29 March 2011 05:36PM




Comment author: Perplexed 29 March 2011 08:28:04PM 1 point

Not at all. That essay simply says that non-deterministic algorithms don't perform better than deterministic ones (for some meanings of 'non-deterministic algorithm'). But the claim that needs explaining is how determinism helps to prevent "making truly spectacular mistakes".

Comment author: timtyler 29 March 2011 09:16:49PM 1 point

Right. No doubt he is thinking that he doesn't want a cosmic ray hitting his friendly algorithm and turning it into an unfriendly one. That calls for robustness - that is, error detection and correction. Determinism seems a reasonable approach to this, and it makes proving things about the results about as easy as possible.
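One way to read this: if the algorithm is deterministic, its state after a given computation is reproducible, so corruption (e.g. a cosmic-ray bit flip) can be detected by comparing against a reference checksum. A minimal sketch of that idea, with illustrative names and a toy serialized state (none of this is from the comment itself):

```python
import hashlib

def checksum(state: bytes) -> str:
    """Return a SHA-256 digest of the algorithm's serialized state."""
    return hashlib.sha256(state).hexdigest()

# A deterministic computation yields the same state on every run,
# so a digest computed once serves as a reference for later checks.
reference_state = b"weights=[0.1, 0.2, 0.3]"  # toy stand-in for real state
reference_digest = checksum(reference_state)

# Simulate a single-bit flip, as from a cosmic ray strike.
corrupted = bytearray(reference_state)
corrupted[0] ^= 0x01

assert checksum(reference_state) == reference_digest   # intact state verifies
assert checksum(bytes(corrupted)) != reference_digest  # corruption is detected
```

A non-deterministic algorithm offers no such fixed reference point, which is one concrete sense in which determinism makes both checking and proving things about the results easier.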