So you're bound to end up losing in this game anyway, right? Negotiation in itself won't bring you any additional power over the coherent extrapolated volition of humanity to change the future of the universe. If others think very differently from you, you need to overpower them to bring your values back into the game, or perish in the attempt.
I don't understand your thinking here at all. Divergent values are not a barrier to negotiation. They are the raw material of negotiation. The barrier to negotiation is communication difficulty and misunderstanding.
Why do you think I lose?
I know Wei Dai has criticized CEV as a construct; I believe he offered the alternative of rigorously specifying volition *before* building an AI. I couldn't find these posts/comments via a search; can anyone link me? Thanks.
There may be related top-level posts, but there is a good chance that what I am specifically thinking of was a comment-thread conversation between Wei Dai and Vladimir Nesov.
Also feel free to use this thread to criticize CEV and to talk about other possible systems of volition.