Douglas_Reay comments on Not Taking Over the World - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
OK, sure. Concreteness is good. I would say the first step to putting numbers on this is to actually agree on a unit that those numbers represent.
You seem to be asserting here that the proper unit is weeks of life (I infer that from "willing to give up 6 weeks of their life to have a share of ownership of The Big Decision"), but if so, I think your math is not quite right. For example, suppose implementing the Big Decision has a 50% chance of making the average human lifespan a thousand years, and I have a 1% chance of dying in the next six weeks, then by waiting six weeks I'm accepting a .01 chance of losing a .5 chance of 52000 life-weeks... that is, I'm risking an expected value of 260 life-weeks, not 6. Change those assumptions and the EV goes up and down accordingly.
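That expected-value calculation can be checked directly. A minimal sketch, using the hypothetical figures from the paragraph above (the probabilities and lifespan are illustrative assumptions, not established values):

```python
# Expected life-weeks risked by waiting six weeks before implementing
# the Big Decision, under the comment's hypothetical numbers.

P_DEATH_DURING_WAIT = 0.01        # chance of dying in the next six weeks
P_LIFESPAN_BOOST = 0.5            # chance the Decision yields 1000-year lifespans
BOOSTED_LIFESPAN_WEEKS = 1000 * 52  # 52,000 life-weeks

expected_loss = P_DEATH_DURING_WAIT * P_LIFESPAN_BOOST * BOOSTED_LIFESPAN_WEEKS
print(expected_loss)  # 260.0 -- life-weeks risked, not 6
```

Changing any of the three assumed inputs scales the expected loss proportionally, which is the point: the stake is sensitive to those assumptions, not fixed at six weeks.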
So perhaps it makes sense to immediately have the AI implement temporary immortality... that is, nobody dies between now and when we make the decision? But then again, perhaps not? I mean, suspending death is a pretty big act of interference... what about all the people who would have preferred to choose not to die, rather than having their guaranteed continued survival unilaterally forced on them?
There are other concerns I have with your calculations here, but this is a relatively simple one, so I'll pause and see if we can agree on a way to handle it before moving forward.
How about QALYs?