Jiro comments on Effective altruism and political power - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Actual human beings' goals don't divide neatly into instrumental and terminal, and actual humans can be inconsistent. So you can have someone who has instrumental goals (which can be changed by evidence showing they don't serve a terminal goal), terminal goals (which cannot), and in-between goals like nuclear power that are harder to change than the former category, but easier to change than the latter.
Yep, this is kinda one of the things LW specializes in--helping people become better at changing their minds regarding things they are stubbornly wrong about.
I agree that human beings' goals don't neatly divide into instrumental and terminal. This is just a model we use. I think Lumifer is using the model in a way that's harmful--labeling stubborn incorrect beliefs as "terminal goals" amounts to throwing up your hands and saying it's impossible to help people become better at changing their minds. Based on what I've seen, this isn't the case--although it's difficult, it is possible to help people become better at changing their minds, and accomplishing this is highly valuable.