Take Teslas, for example. In particular, take the neural nets driving this car here. How many parameters do they have? I just need a ballpark estimate, within one order of magnitude would be great. If not for Tesla, then for Waymo or some such.
(My google-fu is failing me, though it does turn up the stat that Tesla's hardware is capable of 144 TOPS, which sounds like 1.44x10^14 operations per second.)
Take this with a grain of salt, but maybe ~119M?
A Medium post from 2019 says "Tesla’s version, however, is 10 times larger than Inception. The number of parameters (weights) in Tesla’s neural network is five times bigger than Inception’s. I expect that Tesla will continue to push the envelope."
Wolfram says of Inception v3 "Number of layers: 311 | Parameter count: 23,885,392 | Trained size: 97 MB"
I'm not sure which version of Inception was being compared to Tesla's, though.
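The arithmetic behind that estimate, assuming (and this is a guess) that the Medium post's comparison was against Inception v3 specifically:

```python
# Back-of-envelope estimate of Tesla's net size,
# assuming the "5x Inception" claim refers to Inception v3.
inception_v3_params = 23_885_392  # parameter count per Wolfram's listing

tesla_estimate = 5 * inception_v3_params
print(f"{tesla_estimate:,}")  # 119,426,960 -> ~119M, i.e. order 10^8
```

So within one order of magnitude, that puts it around 10^8 parameters, if the comparison holds.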
Thanks!
I wonder whether it would suddenly start working a lot better if they could e.g. make all their nets 1000x bigger...