"What would I do if I were wiser?" may not be a meaningful question. Your current idea of wisdom is shaped by your current limitations.
The usual idea, at least, is that wisdom is acquired through experience, and how can you know how more experience will affect you? Even an idea of wisdom formed by observing people who seem wiser than yourself is necessarily incomplete: all you can see are the effects of a process you haven't yet incorporated into yourself.
I know Wei Dai has criticized CEV as a construct, I believe offering the alternative of rigorously specifying volition *before* building an AI. I couldn't find these posts/comments via search; can anyone link me? Thanks.
There may be related top-level posts, but there is a good chance that what I'm specifically thinking of was a comment-level conversation between Wei Dai and Vladimir Nesov.
Also feel free to use this thread to criticize CEV and to talk about other possible systems of volition.