I don't understand it, but from reading the Wikipedia summary it seems to me that it measures the complexity of the system. Complexity is not necessarily consciousness.
According to this theory, what is the key difference between a human brain and, let's say, a hard disk of the same capacity connected to a high-resolution camera? Let's assume the data from the camera are written in real time to pseudo-random parts of the hard disk, where the pseudo-random locations are chosen by calculating a checksum of the whole hard disk. This system is obviously not conscious, but it seems complex enough.
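To make the thought experiment concrete, here is a minimal sketch of such a system. Everything here (block count, block size, the use of SHA-256 as the "checksum") is an illustrative assumption, not part of the original description; the point is only that each write location depends on the entire current state of the disk:

```python
import hashlib

# Hypothetical sketch: each incoming camera "frame" is written to a
# pseudo-random block of the disk, where the block index is derived
# from a checksum of the disk's entire current contents.

DISK_BLOCKS = 1024   # assumed disk size, in blocks
BLOCK_SIZE = 4096    # assumed block size, in bytes

disk = [bytes(BLOCK_SIZE) for _ in range(DISK_BLOCKS)]

def next_block_index(disk):
    """Choose the next block by hashing the whole disk (the 'checksum')."""
    h = hashlib.sha256()
    for block in disk:
        h.update(block)
    return int.from_bytes(h.digest()[:8], "big") % len(disk)

def write_frame(disk, frame):
    """Write one camera frame to the pseudo-randomly chosen block."""
    idx = next_block_index(disk)
    disk[idx] = frame[:BLOCK_SIZE].ljust(BLOCK_SIZE, b"\x00")
    return idx

# Feed in a few fake "frames"; each write location depends on all prior state.
for i in range(3):
    write_frame(disk, bytes([i]) * 100)
```

The write pattern is highly state-dependent and hard to predict, so the system's information dynamics look intricate, yet intuitively nothing about it suggests consciousness.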
In general, picking the highest EV option makes sense, but in the context of what sounds like a stupid/lazy economics experiment, you have a moral duty to do the wrong thing. Perhaps you could have flipped a coin twice to choose among the first 4 options? That way you are providing crappy/useless data and they have to pay you for it!
Why do I have a moral duty to do the wrong thing? Shouldn't I act in my own self-interest and maximise the amount of money I make?