ZeroGravitas

https://twitter.com/Z3R0Gravitas

Very nice exploration of this idea. But I don't see any mention, in the main post or (by a simple search) in the comments, of my biggest concern about this notion of an AI development slowdown (or complete halt, as EY recently called for in Time):

Years of delay would put a significant kink in the smooth exponential progress charts of e.g. Ray Kurzweil. That's a big deal, in my opinion, because:

(1) These trends aren't recognised as laws of nature. But they have been so consistent, across so many technologies and across the entire history of growing complexity in the universe, that diverting them at will feels like commanding the tide not to come in. Their continuation is probably less avoidable than we realise.

(2) If such a measure were somehow enacted, why wouldn't one expect civilizational collapse? Why should current society necessarily be stable in stasis? Historically, stagnant empires have collapsed, and the contemporary world is as tightly linked as any empire, with developments moving faster and at a bigger scale than ever. I feel sure there is a growing wake of known and unknown problems close behind us, ready to wash over us all if we come to a sudden stop.

Also, AI is not a single application that could be suppressed, unlike the examples mentioned. Those I'd liken to mushrooms, sprouting from a fungal network of technological capabilities hidden out of sight. AI, by contrast, has already become a pervasive research tool underpinning many continued advances.

Meanwhile, AGI looks set to step in and extend the exponential upward trend of ever more brains, now that the human population is plateauing. Preventing these minds from being 'born' would, in my view, be equivalent to insanely harsh population controls: a relative culling, compared to what's needed to continue on trend.

[Edit: I'd appreciate explicit critique of these thoughts, please.]