BerryPick6 comments on A reply to Mark Linsenmayer about philosophy - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (68)
Science can't tell us anything about how to be more rational? Is that your claim?
After breaking down the equation for how one gets to an 'ought' statement, I think it's obvious how science can help us inform our 'oughts'. You seem to agree, more or less, with my assessment of the calculation necessary for reaching 'ought' statements, and since science can tell us things about each of the individual parts of the calculation, it follows that it can tell us things about the sum as well.
Hmm... After thinking about it, it seems more likely that rationality belongs to the 'is' box, and reflectiveness/informedness belong in the 'desire/goal' box. Duly noted.
I'm not sure I understand what you are objecting to.
I think the claim is that science can't tell us how to become "perfectly rational". Science can certainly tell us how to become "more rational", but only if we already have a specification of what "more rational" is, and just need to figure out how to implement it. I think most of us who are trying to figure out such specifications do not see our work as following the methods of science, but rather more like doing philosophy.
I was responding to your claim:
"perfectly informed and perfectly rational"
You have shifted the ground from "perfect" to "better".
That's because you are still thinking of an 'ought' as an instrumental rule for realising personal values, but in the context of the is-ought divide, it isn't that; it is ethical. You still haven't understood what the issue is about. There are circumstances under which I ought not to do what I desire to do.
The better it gets, the closer it gets to perfect. Eventually, if science can tell us enough about rationality, there's no reason we can't come to understand its best form.
I'm a Moral Anti-Realist (probably something close to a PMR, a la Luke) so the is-ought problem reduces to either what you've been calling 'instrumental meaning' or to what I'll call 'terminal meaning', as in terminal values.
There's nothing more to it. If you think there is, prove it. I'm going with Mackie on this one.
Yes, as I've said. When your beliefs about the world are wrong, or your beliefs about how best to achieve your desires are wrong, or your beliefs about your values are misinformed or unreflective, then the resulting 'ought' will be wrong.