I find it particularly important because of the example of automating research, which is probably the task I care most about.
Neither math research nor programming and debugging has been taken over by AI so far, and none of these require any of the complicated unconscious circuitry for sensory or motor interfacing. The programming application, at least, would also have immediate and major commercial relevance. I think these activities are fairly similar to research in general, which suggests that what one would classically call the "thinking" parts remain hard to implement in AI.
Yes, but I don't think that's relevant. Any measure of complexity depends on the language you specify it in. If you object to what I've said here on those grounds, you have to throw out Solomonoff, Kolmogorov, etc.
More specifically, it seems that your c must include information about how to interpret the X bits. Right? So it seems slightly wrong to say "R is the largest number that can be specified in X bits of information" as long as c stays fixed. c might grow as the specification scheme changes.
Alternatively, you might just be wrong in thinking that 30 bits are enough to specify 3^^^^3. If c indicates that the number of additional universes is specified by a standard binary-encoded number, 30 bits only gets you about a billion.
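To make that last point concrete, here's a quick sanity check on the arithmetic (an illustrative sketch; the function and variable names are mine, not from the thread). A plain binary encoding in 30 bits tops out just over a billion, while even tiny up-arrow expressions dwarf that:

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow: a with n arrows applied to b.
    up_arrow(3, 4, 3) would be 3^^^^3, far too large to compute."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# 30 bits of standard binary encoding cover at most 2**30 - 1:
print(2**30 - 1)          # 1073741823, about a billion

# Even two arrows grow explosively:
print(up_arrow(3, 2, 2))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^(3^3) = 7625597484987
```

So under that interpretation scheme, 30 bits fall short of 3^^^^3 by an unimaginable margin; reaching it in few bits requires the interpreter c to already know about up-arrow-style notation.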