Take Teslas, for example. In particular, take the neural nets driving this car here. How many parameters do they have? I just need a ballpark estimate, within one order of magnitude would be great. If not for Tesla, then for Waymo or some such.

(My google-fu is failing me, though it does turn up the stat that Tesla's hardware is capable of 144 TOPS, which sounds like 1.44x10^14 operations per second.)
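As a trivial sanity check on that conversion (1 TOPS = 10^12 operations per second):

```python
# 144 TOPS in operations per second; 1 TOPS = 10^12 ops/sec.
tops = 144
ops_per_second = tops * 10**12
print(f"{ops_per_second:.2e}")  # 1.44e+14
```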


eg

Take this with a grain of salt, but maybe ~119M?

A Medium post from 2019 says: "Tesla’s version, however, is 10 times larger than Inception. The number of parameters (weights) in Tesla’s neural network is five times bigger than Inception’s. I expect that Tesla will continue to push the envelope."

Wolfram says of Inception v3: "Number of layers: 311 | Parameter count: 23,885,392 | Trained size: 97 MB"

Not sure which version of Inception was being compared to Tesla's, though.
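The arithmetic behind that estimate, assuming the Medium post's 5× claim and Wolfram's Inception v3 count:

```python
# Inception v3 parameter count per Wolfram; Medium post claims Tesla's
# network has 5x as many parameters.
inception_v3_params = 23_885_392
tesla_estimate = inception_v3_params * 5
print(f"~{tesla_estimate / 1e6:.0f}M parameters")  # ~119M parameters
```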

Thanks!

I wonder whether it would suddenly start working a lot better if they could e.g. make all their nets 1000x bigger...

jbkjr

I think they do some sort of distillation, where they train massive models to label data or act as “overseers” for the much smaller models that are actually deployed in the cars (inference latency has to be low enough to make decisions in real time)… so I wouldn’t expect the in-car nets to be that big. More details about this can be found in Karpathy’s recent CVPR talk, iirc, though not about parameter count/model size.
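For readers unfamiliar with the technique being described: the generic knowledge-distillation idea is that a small "student" network is trained to match the softened output distribution of a large "teacher" network. A minimal sketch in plain Python (this illustrates the general method only, not Tesla's actual pipeline; the function names and temperature value are my own):

```python
import math

def softmax(logits, temperature=1.0):
    """Turn raw logits into a probability distribution; a higher
    temperature 'softens' the distribution."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened outputs and the
    student's: minimized when the student mimics the teacher."""
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))
```

By Gibbs' inequality the loss is smallest when the student's distribution matches the teacher's, so gradient descent on this loss pushes the small deployed model toward the big model's behavior.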