
Caledonian2 comments on Something to Protect - Less Wrong

Post author: Eliezer_Yudkowsky 30 January 2008 05:52PM

Comment author: Caledonian2 31 January 2008 02:54:30PM 0 points

A rationalistic moral relativist might say that actions require goals, ultimate goals are arbitrary, and so rationality cannot be the starting point there.

Lots of things act without having any sort of goals. Does fire have a goal of breaking down high-energy compounds into oxidized products and free energy? No, but it does so anyway.

You can limit 'action' to intentional events only, I suppose.

However, how does declaring that goals are arbitrary rule out assertions about necessary starting points?

So it could be countered that 'rationality' never has to supply everything; its purpose will largely be to critique existing purposes, order them by significance, or evaluate new possibilities.

If the goals already developed are incompatible with each other, rationality isn't going to help much. If they're incompatible with rationality, it really isn't going to help. But then no help is possible at all.

Say something more about what you think the role of rationality should be in developing a morality, and about the particular powers it has to fulfil that role.

Rationality is required to form a coherent model (however incomplete or imperfect) of the world. To take an action with the intention of bringing about a specific result requires a coherent model. Ergo...

An incoherent actor can't be said to have any goals at all.