"For a true Bayesian, information would never have negative expected utility". I'm probably being a technicality bitch, attacking an unintended interpretation, but I can see bland examples of this being false if taken literally: A robot scans people to see how much knowledge they have and harms them more if they have more knowledge, leading to a potential for negative utility given more knowledge.
When I read Eliezer's posts on free will, and then spent time thinking it through myself, I came to the conclusion that the question was (non-obviously) incoherent, and that this is what Eliezer also thinks. More specifically: when you Taboo "free will", you find yourself trying to say that physics doesn't control your brain, "you" do, where "you" has to be something outside physics, which is really just "magic". Thinking further about questions, I realised that there are whole classes of them for which neither "Yes" nor "No" is the correct response. The most basic example is a contradiction: P = (A is true & A is false).
"What would it feel if we actually had P?" "Uhh, hold on a second, that is nonsense." "I know we can't have P but what if we did! What would it feel like??"
I find this understanding incredibly valuable, because it is the traditional philosopher's downfall to try to answer every question, no matter what. You can fully explain what is wrong with the question and yet still feel like there is a question. So the problem becomes psychological rather than philosophical, and can be attacked by figuring out what our brains are doing when they try to tackle the question.
Additionally, I think it is a Mind Projection Fallacy to ask a question like "What would infrared look/feel like if we could see it?" Stimuli don't necessitate particular perceptions: you could rewire your blue and red photoreceptors so as to perceive blue, I mean, 420nm light, as red [untested]. I'm mildly confident this extends to asking what free/un-free will would feel like. You can probably run with certain aspects of the idea without going loopy, but I still think the overall idea is wobbygobby.
Another mistake is thinking that being unpredictable gives you more free will. Either you are controlled by predictable atoms or by unpredictable dice-rolling mechanics, or whatever; neither case gets "you" any closer to being in charge. I'd even prefer being predictable to being randomised. That shit's crazy.
Anyway, I expect I'm going to get a lot of disagreement stemming from us having different ideas about free will. Everyone struggles to formalise the informal in different ways, so if something I've said reads as nonsense, ask yourself whether we are just using the same string of characters to point to different ideas.