loqi comments on Rationalists lose when others choose - Less Wrong

-10 Post author: PhilGoetz 16 June 2009 05:50PM


Comment author: loqi 17 June 2009 06:35:26PM 1 point

I still think what you're saying is contradictory. We're using "rationality" to mean "maximizing expected utility", correct? If we know that certain classes of attempts to maximize expected utility will be punished, then we know those attempts will not in fact maximize our expected utility, so by definition such attempts aren't rational.
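To make the point concrete, here is a toy sketch (not from the original discussion; the action names and payoffs are hypothetical). If punishment is folded into the outcomes, the expected-utility calculation itself already disfavors the punished strategy, so choosing the unpunished one is the rational move by definition:

```python
def expected_utility(outcomes):
    """Expected utility of an action: sum of probability * utility."""
    return sum(p * u for p, u in outcomes)

# Each hypothetical action maps to (probability, utility) pairs.
actions = {
    # A strategy that observers punish 10% of the time when detected.
    "punished_strategy": [(0.9, 10.0), (0.1, -100.0)],
    # A strategy that forgoes that payoff but avoids punishment entirely.
    "unpunished_strategy": [(1.0, 5.0)],
}

# The EU-maximizing choice, with punishment priced in.
best = max(actions, key=lambda a: expected_utility(actions[a]))
```

Here the "punished" action nets roughly 0.9 × 10 − 0.1 × 100 = −1, so the agent that accounts for punishment picks the other action; nothing irrational has happened.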

It seems like you're picking and choosing which counterfactuals "count" and which ones don't. How does punishment differ from any other constraint? If I inhabited a universe in which I had an infinite amount of time and space with which to compute my decisions, I'd implement AIXI and call it good. The universe I actually inhabit requires me to sacrifice that particular form of optimality, but that doesn't mean it's irrational to make theoretically sub-optimal decisions.