A quick Google search brings up "computation theory". This does not seem to be as precise as information theory: we cannot talk about "n units of computation" the same way we can talk about "m bits of information". In fact, there does not seem to be any generally accepted fundamental unit of computation.
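For contrast, "m bits of information" is perfectly precise. Here is a minimal sketch of Shannon entropy in bits (the distributions are arbitrary examples of mine):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per flip; a biased coin carries less.
print(entropy_bits([0.5, 0.5]))  # 1.0
print(entropy_bits([0.9, 0.1]))  # ~0.47
```

There is no analogous machine-independent function that takes a program and returns "n units of computation".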
Computational complexity theory is well developed, but it only talks about asymptotic scaling, not about absolute amounts of computation.
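A toy illustration of the point (my own example): both functions below are O(n), so complexity theory treats them as equivalent, even though one plainly does five times the work of the other.

```python
def sum_once(xs):
    total = 0
    for x in xs:        # ~n additions
        total += x
    return total

def sum_redundantly(xs):
    total = 0
    for x in xs:        # ~5n additions: same O(n) class, 5x the work
        for _ in range(5):
            total += x
    return total // 5   # integer inputs assumed
```

Big-O notation erases that factor of five by design; a fundamental unit of computation would have to keep it.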
Turing machines seem to be able to "hide" a lot of the work in the size of the tape alphabet or in the number of internal states of the machine.
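A back-of-the-envelope sketch of that hiding (counting one "step" per cell visited, which is my own simplification): packing more bits into each tape symbol shrinks the step count for the very same scan, so raw step counts are not machine-independent.

```python
def scan_steps(n_bits, bits_per_symbol):
    """Head moves needed to scan n bits when each tape cell
    packs `bits_per_symbol` bits into one symbol."""
    return -(-n_bits // bits_per_symbol)  # ceiling division

print(scan_steps(64, 1))  # 64 steps over a binary alphabet
print(scan_steps(64, 8))  # 8 steps over a 256-symbol alphabet
```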
Logic circuits haven't produced much either, and the choice of gates seems a bit arbitrary (why only OR, AND, and NOT?).
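For what it's worth, {OR, AND, NOT} is just one of several universal bases; NAND alone suffices, which is easy to check exhaustively:

```python
def nand(a, b):
    return 1 - (a & b)

# Build the usual gates out of NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

# Verify against the standard truth tables.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
    assert not_(a) == 1 - a
print("NAND alone reproduces AND, OR, and NOT")
```

So whatever unit of computation circuits suggest, it can't depend too heavily on the particular gate set, since universal bases simulate each other with only constant overhead.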
Seems like we need a theory of computation to qualitatively understand things like logical uncertainty.
If that's what you meant, it was rather unclear in the initial comment. It is, in fact, very important that we do not know what the sequence is. You could see the computation as determining which book in the Library of Babel to look at. There is only one correct book [though some are close enough], and we have to find it [thus, it is a search problem]. How difficult this search is, is actually a well-defined question, but there are multiple ways of carrying it out [for instance, with a specialist algorithm, or a general one].
Of course, I do agree that a lookup table can make some problems trivial, but that doesn't work for this sort of thing [and a lookup table of literally everything is basically what the Library of Babel would be]. Pure dumb search doesn't work that well, especially when the table is infinite.
Edit: You can consider finding it randomly an upper bound on the computational difficulty, but lowering that bound requires an actual algorithm [or at least a good description of the kind of thing it is], not just the fact that an algorithm exists. The Library of Babel proves very little in this regard. (Note: I had to edit my edit due to writing something incorrect.)
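To make the "dumb search as an upper bound" point concrete, here is a minimal sketch of mine (systematic enumeration rather than random sampling, but the same idea): try candidate "books" until a checker accepts one. The count it returns is a valid, if terrible, upper bound on the problem's difficulty, since at least one algorithm achieves it; a specialist algorithm can only lower it.

```python
from itertools import product

def brute_force_search(alphabet, max_len, accept):
    """Enumerate strings in length order until `accept` says yes.
    Returns (match, candidates_tried); the count is an upper bound
    on the intrinsic difficulty, since this algorithm achieves it."""
    tried = 0
    for length in range(1, max_len + 1):
        for chars in product(alphabet, repeat=length):
            tried += 1
            candidate = "".join(chars)
            if accept(candidate):
                return candidate, tried
    return None, tried

# Toy target: the "one correct book" is just a known short string here.
match, tried = brute_force_search("ab", 10, lambda s: s == "abba")
print(match, tried)  # the cost grows exponentially with the target's length
```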