randallsquared comments on 'Is' and 'Ought' and Rationality - Less Wrong

2 Post author: BobTheBob 05 July 2011 03:53AM


Comments (70)


Comment author: randallsquared 08 July 2011 03:42:44PM 0 points [-]

Knowledge requires justification. A TLGH that understands epistemology would see itself as not knowing its TLG, since "it was hardwired into me" is no justification. This applies to humans: we are capable of doubting that our evolutionarily derived moral attitudes are the correct ones.

This only applies to humans because we are not TLGHs. Beliefs and goals require justification because we might change them. Beliefs and goals which are hardwired do not require justification; they must be taken as given. As far as I'm aware, humans only ever have beliefs or goals that seem hardwired in this sense in the case of damage, like people with Capgras delusion.

However, that would be subject to the Open Question objection: we can ask of our inherited morality whether it is actually right. (Unrelatedly, we are probably not determined to follow it, since we can overcome strong evolutionary imperatives in, for instance, voluntary celibacy).

In fact, I would argue that we can only genuinely ask if our "inherited morality" is right because we are not determined to follow it.

Comment author: Peterdjones 08 July 2011 04:02:09PM 1 point [-]

This only applies to humans because we are not TLGHs. Beliefs and goals require justification because we might change them.

I said knowledge requires justification. I was appealing to the standard Justified True Belief theory of knowledge. The fact that belief per se does not need justification is not relevant.

Comment author: randallsquared 08 July 2011 04:27:36PM 0 points [-]

So,

A TLGH that understands epistemology would see itself as not knowing its TLG, since "it was hardwired into me" is no justification.

It's no justification in this technical sense, and it might cheerfully agree that it doesn't "know" its TLG in this sense, but that's completely aside from the 100% certainty with which it holds it, a certainty which can be utterly unshakable by reason or argument.

I misunderstood what you were saying due to "justification" being a technical term, here. :)