Kingreaper comments on Should humanity give birth to a galactic civilization? - Less Wrong

-6 [deleted] 17 August 2010 01:07PM


Comment author: Perplexed 17 August 2010 04:04:47PM 4 points [-]

Thanks for posting. Upvoted.

I have always had an uncomfortable feeling whenever I have been asked to include distant-future generations in my utilitarian moral considerations. Intuitively, I draw on my background in economics, and tell myself that the far-distant future should be discounted toward zero weight. But how do I justify the discounting morally? Let me try to sketch an argument.

I will claim that my primary moral responsibility is to the people around me. I also have a lesser responsibility to the next generation, a still lesser responsibility to the generation after that, and so on: a steep discount rate of 30% per generation or so. I will do my duty to the next generation, but in turn I expect the next generation to do its duty to the generation after that. After all, the next generation is in a far better position than I am to foresee what problems the generation after that really faces. Their efforts will be much less likely than mine to be counterproductive.

If I were to spread my concern over too many generations, I would be shortchanging the next generation of their fair share of my concern. Far-future generations have plenty of predecessor generations to worry about their welfare. The next generation has only us. We mustn't shortchange them!

This argument is just a sketch, of course. I just invented it today. Feedback is welcome.

Comment author: Kingreaper 17 August 2010 11:51:51PM *  1 point [-]

I don't think you need any discounting. Your effect on the year 2012 is somewhat predictable: it is possible to choose a course of action based on its known effects on the year 2012.

Your effect on the year 3000 is unpredictable. You can't even begin to predict what effect your actions will have on the human race in the year 3000.

Thus, there is an automatic discounting effect. An act is only as valuable as its expected outcome. The expected outcome on the year 1,000,000 is almost always ~zero, unless there is some near-future extinction possibility, because the probability of your having the desired impact is essentially zero.
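The "automatic discounting" point can be put as a one-line expected-value calculation. All the numbers below are hypothetical assumptions chosen for illustration:

```python
# Expected value of an act aimed at a target year, ignoring side effects:
# it shrinks with the probability that the act has its intended effect.

def expected_value(p_intended_effect: float, value_if_achieved: float) -> float:
    """Probability-weighted value of an act's intended outcome."""
    return p_intended_effect * value_if_achieved

# Assumed probabilities for illustration only:
near_future = expected_value(0.5, 100.0)   # year 2012: somewhat predictable
far_future = expected_value(1e-9, 100.0)   # year 1,000,000: essentially unpredictable

# far_future is ~zero even though the stakes are identical,
# so no explicit discount rate is needed to get the same effect.
```

This is why uncertainty alone behaves like a discount rate: the value term is constant, but the probability term collapses as the horizon grows.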