Inference Speed is Not Unbounded
[The intro of this post has been lightly edited since it was first posted to address some comments. I have also changed the title to better reflect my core argument. My apologies if that is not considered good form.]

This post is a summary of some of my ideas on what intelligence is, the processes by which it's created, and a discussion of the implications. Although I prefer to remain pseudonymous, I do have a PhD in Computer Science and I've done AI research at both Amazon and Google Brain. I spent some time tweaking the language to minimize how technical you need to be to read it.

There is a recurring theme I've seen in discussions about AI where people express incredulity about neural networks as a method for AGI because they require so much more data than humans to train. On the other hand, I see some people discussing superintelligences that make impossible inferences from virtually no input data, positing AI that will instantly do inconceivable amounts of processing. Both of these very different arguments are making statements about learning speed, and in my opinion both mischaracterize what learning actually looks like.

My basic argument is that there are probably mathematical limits on how fast it is possible to learn. This means, for instance, that training an intelligent system will always take more data and time than might initially seem necessary. What I'm arguing is that intelligence isn't magic: the inferences a system makes have to come from somewhere. They have to be built, and they have to be built sequentially. The only way you get to skip steps, and the only reason intelligence exists at all, is that it is possible to reuse knowledge that came from somewhere else.

Three Apples and a Blade of Grass

Because I think it makes a good jumping-off point, I'm going to start by framing this around a recent discussion I saw of a years-old quote from Yudkowsky about superintelligence:

> A Bayesian superintelligence, hooked up to a webcam, would invent General Relativity as a hypothesis ... by the time it had seen the third frame of a falling apple. It might guess it from the first frame, if it saw the statics of a bent blade of grass.
I actually say to myself “alright, someone who's good at math, come solve this,” which I guess kind of puts my internal monologue into math mode.