A quick Google search brings up "computation theory". This does not seem to be as precise as information theory: we cannot talk about "n units of computation" the same way we can talk about "m bits of information". In fact, there does not seem to be a generally accepted fundamental unit of computation that we can talk about.
Computational complexity theory is well developed, but it only talks about how costs scale, not about absolute amounts of computation.
Turing machines seem to be able to "hide" a lot of the work in the number of symbols available for the tape or the number of states available to the head.
Logic circuits haven't produced much and seem a bit arbitrary (why only OR, AND, and NOT gates?).
Seems like we need a theory of computation to qualitatively understand things like logical uncertainty.
Turing machine encodings don't really "hide" much in the alphabet or the state table, since their information content is only logarithmic. Large alphabets can be emulated efficiently with only logarithmic slowdown; that is, an alphabet of up to 2^N symbols can always be emulated by a TM with only 2 symbols running N times slower. Larger state tables are harder to emulate efficiently, but there are at least many universal Turing machines that can emulate any other Turing machine with overhead proportional to the information content of the state table.
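To make the alphabet point concrete, here's a minimal sketch (Python, with names I've made up for illustration) of the usual encoding argument: a tape over a 2^N-symbol alphabet is packed into a binary tape where each cell becomes a block of N bits, so reading or writing one original symbol only touches N binary cells. It's just the tape encoding, not a full TM simulator:

```python
# Sketch of the alphabet-compression argument: each cell over a 2**N-symbol
# alphabet becomes a block of N binary cells, giving the logarithmic slowdown
# mentioned above. (Illustrative only.)

N = 4  # alphabet of 2**4 = 16 symbols

def encode_tape(symbols, n=N):
    """Flatten a tape over a 2**n-symbol alphabet into a binary tape."""
    bits = []
    for s in symbols:
        assert 0 <= s < 2 ** n
        bits.extend((s >> i) & 1 for i in reversed(range(n)))
    return bits

def read_symbol(bits, cell, n=N):
    """Reading one original cell touches only n binary cells."""
    block = bits[cell * n:(cell + 1) * n]
    return sum(b << i for i, b in enumerate(reversed(block)))

tape = [3, 15, 0, 9]
bits = encode_tape(tape)
assert [read_symbol(bits, i) for i in range(len(tape))] == tape
assert len(bits) == N * len(tape)  # space grows by a factor of N = log2(|alphabet|)
```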
Boolean circuits aren't really arbitrary either. The exact set of operators doesn't matter much, since every set of operators can be (usually efficiently) emulated by any other universal set. The set {OR, AND, NOT} is just a familiar one that can express every other boolean function. The singleton sets {NAND} and {NOR} are also commonly used; either one is universal on its own, and the only difference between the two is the labelling of the truth values. In one sense, a single NAND (or NOR) gate is the simplest element of digital computation.
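Here's a small sketch of that universality claim (Python, function names my own): NOT, AND, OR, and XOR built from NAND alone, checked against the built-in operators over the full truth table:

```python
def nand(a: bool, b: bool) -> bool:
    """NAND: the only primitive gate used below."""
    return not (a and b)

# Every other boolean operator built from NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b):
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Exhaustive truth-table check against Python's built-in operators.
for a in (False, True):
    for b in (False, True):
        assert not_(a)    == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b)  == (a or b)
        assert xor_(a, b) == (a != b)
print("NAND alone reproduces NOT, AND, OR, XOR")
```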
I'm not sure what you mean by needing a theory of computation to understand things like logical uncertainty. Can you expand on this a bit more?
Yes, I suppose I left out that you can determine that something can't be computed at all if you can't do it with a Turing machine. Proofs of impossibility are actually somewhat important.
Practical costs today are shifting sand: worth estimating for difficult and important tasks, but useless for a number of others because they are so hard to determine. What algorithm, and what should be the reference computer? (Or reference-computer building / buying process. It would be silly to include a massive R&D program, but what about a small one that ...