If the general capabilities needed for effective self-improvement, or for AGI directly, can be reached without the apparent complexity of the brain structures that enable general intelligence in humans (just with memory, more data, compute, and perhaps some algorithmic breakthroughs, or even none), I wonder why those structures are not needed.
Sure, a sufficiently advanced AI doesn't have to work like the brain, but if you are going to defend short timelines, there should be some intuition about why those neural structures are not needed to create at least an autonomous utility maximizer.
The octopus's brain(s) is nothing like a mammal's, and yet octopuses are comparably intelligent.
Yeah, but I'd need more specificity than just an example of a brain with a different design.