A quick Google search brings up "computation theory". This does not seem to be as precise as information theory: we cannot talk about "n units of computation" the way we can talk about "m bits of information". In fact, there does not seem to be any generally accepted fundamental unit of computation to talk about.
Computational complexity theory is well-developed, but it only talks about asymptotic scaling: knowing a problem is O(n log n) rather than O(n^2) tells you nothing about the absolute cost of any particular instance.
Turing machines seem able to "hide" a lot of the work in the number of available tape symbols or the number of available states of the head (see the sketch below).
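As a minimal sketch of why (the names and toy program below are made up): a Turing machine's transition table has one entry per (state, symbol) pair, so a machine with a huge alphabet and state set can pack an enormous amount of precomputed work into each "step".

```python
# Minimal Turing machine sketch. The transition table has
# |states| * |symbols| entries, so "one step" on a machine with a
# large table can encode far more work than "one step" on a small one.

def run(tape, transitions, state="start", head=0, blank="_", max_steps=1000):
    """transitions maps (state, symbol) -> (new_state, symbol_to_write, move),
    with move in {-1, 0, +1}."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    for step in range(max_steps):
        if state == "halt":
            return cells, step
        symbol = cells.get(head, blank)
        state, cells[head], move = transitions[(state, symbol)]
        head += move
    return cells, max_steps

# Toy one-state program: flip every bit until the first blank, then halt.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
cells, steps = run("1011", flip)
print(steps, "".join(cells[i] for i in sorted(cells)))  # 5 0100_
```

Counting steps alone charges nothing for the size of `transitions`, and nothing stops us from folding an entire computation into one giant table, so steps are not a fair unit unless we also charge for states and symbols.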
Logic circuits haven't produced much either, and the choice of basis seems a bit arbitrary (why AND, OR, and NOT gates rather than some other set?).
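For what it's worth, the standard justification is functional completeness: any Boolean function can be expressed using only AND, OR, and NOT (e.g., via its disjunctive normal form), and in fact NAND alone suffices, so the particular gate set looks more like a convention than anything fundamental. A minimal sketch:

```python
# Sketch: derive NOT, AND, and OR from NAND alone, illustrating that
# the {AND, OR, NOT} basis is one convention among several.
def nand(a, b):
    return not (a and b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

# Verify against the truth tables.
for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```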
It seems like we need a theory of computation to qualitatively understand things like logical uncertainty.
Such a table cannot really be created because it is too large: too large not just to compute, but even to store in memory if it were somehow given to you. It is not out of the question that computing resources continue to grow enough that this eventually becomes feasible, but we have no idea whether they will, and it would be a long way off.
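To put a rough number on the "too large" claim (illustrative arithmetic; the input size n = 300 is an arbitrary choice of mine): a lookup table keyed on n-bit inputs needs 2^n entries, and already at n = 300 that exceeds the roughly 10^80 atoms in the observable universe.

```python
# Illustrative arithmetic: a lookup table over n-bit inputs needs 2**n
# entries. n = 300 is an arbitrary example size.
n = 300
entries = 2 ** n
atoms_in_observable_universe = 10 ** 80  # common order-of-magnitude estimate
print(f"2**{n} ~ {entries:.2e} entries")      # ~2.04e+90
print(entries > atoms_in_observable_universe)  # True
```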
Theoretical Turing machines are very simple, but they have unbounded resources (an infinite tape and unlimited time), and are thus a poor way of measuring how difficult things are.