
Jiro comments on Effective altruism and political power - Less Wrong Discussion

Post author: adamzerner, 17 June 2015 05:47PM (2 points)




Comment author: Jiro, 19 June 2015 05:49:43PM (1 point)

Actual human beings' goals don't divide neatly into instrumental and terminal, and actual humans can be inconsistent. So you can have someone who has instrumental goals (which can be changed by evidence showing that they don't serve a terminal goal), terminal goals (which cannot), and in-between goals, like nuclear power, that are harder to change than the former category but easier to change than the latter.

Comment author: hg00, 20 June 2015 01:07:27AM (0 points)

in-between goals, like nuclear power, that are harder to change than the former category but easier to change than the latter

Yep, this is kinda one of the things LW specializes in--helping people become better at changing their minds about things they're stubbornly wrong about.

I agree that human beings' goals don't divide neatly into instrumental and terminal. This is just a model we use. I think Lumifer is using the model in a way that's harmful--labeling stubborn incorrect beliefs as "terminal goals" amounts to throwing up your hands and declaring it impossible to help people become better at changing their minds. Based on what I've seen, this isn't the case--although it's difficult, it is possible to help people become better at changing their minds, and accomplishing this is highly valuable.