Tesla's algorithm is supposed to be autonomous on freeways, not on highways with intersections like this one. So even if the algorithm had done exactly what it was designed to do, that would not have prevented a crash. But the algorithm was also supposed to eventually apply the brakes, and its failure to do so was a real failure of the system. The driver erred as well by failing to brake, probably because he was relying on the algorithm inappropriately. Maybe this was a difficult situation and he could not have been expected to prevent a crash, but his failure to brake at all is a bad sign.
It was obvious when Tesla first released this feature that people were using it inappropriately. I think they have since released updates to encourage better use, but I don't know how successful those have been.
But isn't fatal accidents per mile a better predictive measure?
I don't think so; different types of car are bought by different people and driven differently. For example, a person who buys a Lamborghini or a Ferrari probably likes to drive fast and show off, while a person who buys a Volvo probably drives much more carefully.