MugaSofer comments on Rationality Quotes January 2013 - Less Wrong

6 Post author: katydee 02 January 2013 05:23PM




Comment author: Kawoomba 14 January 2013 12:46:32PM 2 points [-]

Penn Jillette is wrong to call someone who doesn't follow a god's demands an atheist. Theism is defined by existence claims regarding gods (whether personal or more broadly defined); as a classifier, it does not hinge on following said gods' mandates.

Comment author: MugaSofer 14 January 2013 02:20:38PM -2 points [-]

Although it seems like an overly broad definition of "atheist", I think the quote is only intended to apply to belief in the monotheistic Supreme Being, not polytheistic small-g-gods.

Comment author: Kawoomba 14 January 2013 04:07:00PM 1 point [-]

My comment applies just the same, whether you spell god God, G_d, GOD or in some other manner: you can believe such a being exists (making you a theist) without following its moral code or whatever commands it levies on you. That doesn't make you an atheist.

Comment author: BerryPick6 15 January 2013 06:11:30PM *  2 points [-]

Although, if you believe it always tells the truth, then you should follow whatever counterintuitive claim it makes about your own preferences and values, no? So if God were to tell you that sacrificing your son is what CEV_(Kawoomba) would do, would you do it?

Comment author: Kawoomba 15 January 2013 06:30:33PM 2 points [-]

There is a certain probability I ascribe to the belief that god always tells the truth; let's say this is very high.

I also have a certain probability with which I believe that CEV_(Kawoomba) contains such a command. This is negligible because (by the definition) it certainly doesn't fit with "were more the [man] [I] wished [I] were".

However, we can lay that argument (weighing a high probability against a very low one) aside; there's a more important one:

The point is that my values are not CEV_(Kawoomba). That is a concept which may make sense to feed an AI with, or even to personally aspire to, but it is not self-evidently one we should unequivocally adopt. In a conflict between my current values and some "optimized" (in whatever way) values that I do not currently have but that may be based on my current ones, guess which win out? (My current ones.)

That aside, there is no way the very foundation of my values could be turned topsy-turvy and still fit with CEV's mandate of "being the person I want to be".