leplen comments on New report: Intelligence Explosion Microeconomics - Less Wrong
I have some questions about the math in the first couple pages, specifically the introduction of k. I'm not totally sure I follow exactly what's going on.
So, my assumption is that we're trying to model AI capacity C as a function of investment i, and I assume that we're modeling this as the integral of an exponential function of base k, such that

C(i) = ∫_0^i k^x dx

with k held constant. The integral is necessary, I believe, to ensure that the derivative of C is positive in both the k<1 and k>1 scenarios, since the integrand k^x is positive everywhere. This, I believe, matches the example of the nuclear chain reaction. I note here that C, as I've defined it, is only a function of investment and tells us nothing about time or any other variable. I think it's also true that we've defined C as an exponential because we're assuming that the AI is reinvesting its returns. This seems to conflict with the linear relationship between investment and returns mentioned in the Chalmers quote, although perhaps those deltas are not intended to be quantitative and equal.
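As a quick sanity check on my reading (the closed form C(i) = (k^i − 1)/ln k and the sample k values here are my own, not from the paper), the integral form does give a C that increases with investment in both regimes:

```python
import math

def C(i, k):
    """Capacity as the integral of k**x dx from 0 to i: (k**i - 1) / ln k."""
    if k == 1.0:
        return float(i)  # integrand is identically 1, so C(i) = i
    return (k**i - 1) / math.log(k)

# The integrand k**x is positive everywhere, so C increases with
# investment i whether k is sub-critical (0.5) or super-critical (2.0).
for k in (0.5, 2.0):
    vals = [C(i, k) for i in range(6)]
    assert all(b > a for a, b in zip(vals, vals[1:]))
```

Note that for k<1 this C doesn't just grow slowly, it's actually bounded above (by −1/ln k), which is part of why I'm unsure the nested-logarithms picture falls out of this function.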
But even then, I'm a little uncertain that my relation is correct. It is not clear to me that the sequence of logarithms obtained in the k<1 case is a result of this function. Specifically, I thought the notion of reinvestment was the motivation for choosing an exponential/logarithmic function to start with, and so I'm not clear on why reinvestment suddenly changes the behavior to that of nested logarithms. Is the logarithmic nature of our return being double counted?
I was also confused by the statement
But from my model, which I think is the correct one, this isn't true. I feel like I understand the math from the nuclear chain reaction, but I have

C(i) = (k^i − 1)/ln k

so that k=1 implies not exponential growth, but linear growth (in the limit k→1, C(i) = i). Even worse, no value of k in my model is capable of making C "explode above" the exponential. I agree with the assessment that k has been slightly on the positive side, which gave me some hope that I still had the correct model, but then I got really discouraged by the fact that k for money is on the order of 1.02 while k for the neutrons in the nuclear pile was 1.006. The implication from the k values alone is that my bank account is somehow more explosive than a large pile of uranium. Unfortunately this is not true, and so it seems like my model needs to account not only for C as a function of i, but for C as a function of time as well.
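To put rough numbers on that (only the two k values come from the discussion; the one-year compounding period for money and the ~0.1 s effective neutron generation time are my own assumptions): what matters for explosiveness is growth per unit *time*, which depends on k and the generation time together, not on k alone.

```python
import math

def doubling_time(k, generation_time):
    """Time for a quantity multiplied by k each generation to double."""
    return generation_time * math.log(2) / math.log(k)

SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Money: k ~ 1.02, with a "generation" of one year.
money = doubling_time(1.02, SECONDS_PER_YEAR)

# Neutrons in a near-critical pile: k ~ 1.006, but a generation is a
# fraction of a second (assumed ~0.1 s here, the rough timescale set
# by delayed neutrons).
neutrons = doubling_time(1.006, 0.1)

# Despite its larger k, the bank account doubles millions of times slower.
assert money > neutrons * 1e6
```

The bank account doubles in decades; the pile doubles in seconds. Same ordering of k, opposite ordering of explosiveness.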
This issue really comes into play with the prompt-critical AI. One of the ways a prompt-critical AI is deemed capable of growing exponentially smarter is by stealing access to more hardware. Having this as an option challenges either the definition of investment or seriously challenges the notion of a constant k. Even in the limit where solving AI problems is exponentially hard (k<1), if gaining access to more resources/investment is not also exponentially hard, then the AI isn't bounded by how hard it is to solve AI, but by how hard it is to acquire resources. An intelligence-fizzle AI that could increase its access to computer time faster than it progressed up the difficulty curve would show exponential growth even IF building an AI were exponentially difficult. Additionally, the characterization here only in terms of k (i.e. return on cognitive investment), without any regard to time in the original formulation, doesn't seem particularly clear to me, especially since time dependence seems to be the only difference between scenarios 2 and 3. (Indeed, what is "prompt k" if not k>1 coupled to a short generation time?)
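Here's a toy model of the resource-acquisition point (entirely my own construction; the cost curve and rates are made-up numbers, not anything from the paper). Reaching each new skill level costs exponentially more compute, i.e. an intelligence-fizzle difficulty curve, but if the agent's compute acquisition grows even faster, it climbs levels at a steady rate anyway:

```python
def levels_reached(steps, acquire_rate, difficulty_base):
    """Toy model: reaching skill level n+1 costs difficulty_base**(n+1)
    units of compute (exponentially hard), while the agent's per-step
    compute is multiplied by acquire_rate each step."""
    level, bank, compute = 0, 0.0, 1.0
    for _ in range(steps):
        bank += compute  # spend this step's compute on research
        while bank >= difficulty_base ** (level + 1):
            bank -= difficulty_base ** (level + 1)
            level += 1
        compute *= acquire_rate  # acquire more hardware each step
    return level

# Fixed compute: progress up the exponential difficulty curve stalls out.
fizzle = levels_reached(30, 1.0, 1.5)
# Compute acquisition outpacing difficulty: levels climb roughly linearly
# in time, so anything that scales exponentially with level (hardware
# controlled, problems solvable) grows exponentially in time.
runaway = levels_reached(30, 2.0, 1.5)
assert runaway > fizzle
```

The binding constraint here is the acquisition rate, not the difficulty curve, which is exactly why characterizing the scenarios by k alone, with no time variable, seems underdetermined.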
I'm really terrible at LW formatting/writing in tiny comment boxes, so if I screwed this up to the point of being confusing let me know.