siodine comments on Interview with Singularity Institute Research Fellow Luke Muehlhauser - Less Wrong Discussion
What do you mean by 'value space' (any human values or desires?) and 'moral change' (generally desirable human values?)? Also, a godlike AI is like playing with infinities when inserted into your moral calculus; a godlike AI leading to horrible consequences per your morality doesn't show that your morality was fatally flawed (maybe there was a tiny loophole no human would try, or be capable of, exploiting). And I think you discount the possibility that there are many moral solutions, just as there are many solutions to chess (this is especially important when noting the impact of culture on values).
Here is a possible case of mild value change currently ongoing in the US:
Is this "generally desirable human values"? Depends on the values you already hold. I certainly think radically undesirable moral change might occur, looking from the value sets of past humans we see it almost certainly would not be a freak occurrence.
My key argument is that we have very little idea about the mechanics of moral change in real human societies. Thus future moral change is fundamentally unsafe from the standpoint of our current value set, and we do not have the tools to do anything about it. Yet. Once we get those tools, unless we realize that the process we are currently chained to has some remarkable properties we like, we will probably do away with it.
If moral progress is a salvageable concept, then we shall see it for the first time in the history of mankind. If not, we will finally do away with the tragedy of being doomed to an alien future devoid of all we value.
Of course "we" is misleading. The society that eventually gets these tools might be one that has values that are quite worthless or even horrifying from our perspective.