I was wondering whether things might be slightly different if you simulated batman-sapience by running the internal representation through simulations of self-awareness and decision-making, using one's own black boxes as substitutes: attempting to mentally simulate every conscious mental process in as much detail as possible, while sharing braintime on the subconscious ones.
Then I got really interested in this crazy idea and decided to do science to it and try.
Shouldn't have done that.
Another monthly installment of the rationality quotes thread. The usual rules apply: