ChristianKl comments on Earning money with/for work in AI safety - Less Wrong

7 Post author: rmoehn 18 July 2016 05:37AM




Comment author: Dagon 18 July 2016 03:27:38PM, 0 points

Consider your ranking of preferences more carefully, and expand your horizons quite a bit. There are lots of ways to improve the predicted future state of humanity that are less direct, but possibly more effective, than this particular topic.

I care about the current and future state of humanity

That's sweet of you. I'm glad.

so I think it's good to work on existential or global catastrophic risk

That's a pretty big jump. I'll grant that human existential risk is important, but why is your best contribution to work directly on it? Perhaps you'd do a lot more good with a slight reduction in shipping costs or tiny improvements in safety or enjoyment of some consumer product. In the likely case that your marginal contribution to x-risk doesn't save the world, a small improvement for a large number of people does massive amounts more good.

Regardless of whether you focus on x-risk or something else valuable, the fact that you won't consider leaving Kagoshima is an indication that you aren't as fully committed as you claim. IMO, that's ok: we all have personal desires that we put ahead of the rest of the world. But you should acknowledge it and include it in your calculations.

Comment author: ChristianKl 19 July 2016 03:44:09PM, -1 points

Perhaps you'd do a lot more good with a slight reduction in shipping costs or tiny improvements in safety or enjoyment of some consumer product.

Perhaps you would also do more good by working on a slight increase in shipping costs.

Comment author: Dagon 19 July 2016 09:50:21PM, 0 points

Quite. Whatever you consider an improvement to be. Just don't completely discount small, likely improvements in favor of large (existential) unlikely ones.