There are several problems here, IMO, but one is that AIXI is being taken rather too seriously.
AIXI is an RL agent with no real conception of its own aims - so it looks as though it will probably wirehead itself at the first available opportunity, and thus fail to turn the universe into paperclips.
Wouldn't a wirehead AI still kill us all, either in order to increase the size of the numbers it can take in or to prevent the feed of high numbers from being switched off?
Link: aleph.se/andart/archives/2011/02/why_we_should_fear_the_paperclipper.html