Last week, the National Highway Traffic Safety Administration, the agency whose mission is to "Save lives, prevent injuries, and reduce vehicle-related crashes," announced a recall of Tesla cars: "On Wednesday the National Highway Traffic Safety Administration announced a massive recall of Teslas equipped with Full Self-Driving Beta, the technology that enables vehicles to control some aspects of driving, such as turning and adjusting speed, in urban environments. The FSD package, which currently costs Tesla owners an additional $15,000 when they buy their cars, requires the driver to be watching the road at all times (although Tesla enthusiasts have figured out ways to trick the cars' attention guardrails for years). The NHTSA recall affects over 360,000 Teslas with FSD, which is pretty much all of them."
One typo: "First, they found that firms and regulars" => regulators
I thought I'd offer some thoughts, at least from my experience.
This is actually a very well thought out and written article, so thank you for sharing. You've done an excellent job explaining what was recalled and how manufacturers typically use software to test their systems in a virtual world. As for validation, it is very true that the government relies on the manufacturers to self-verify that they meet all the government requirements. Since there are so many requirements spread over so many products, self-governing and documenting is the only way to validate these systems with any sort of speed, especially as changes are needed mid-production.
As for recalling software: Recalling software is nothing new. What has changed is that we can now do it over the air. Prior to this, the software would be fully validated before being released on the product. If there was an issue, the customer would go to the dealership and get the new software. It functioned like any other recall. Secondly, this is not the first software recall that Tesla has had. This is what I have found especially concerning with over-the-air updates. While we can get some fun new features in the infotainment, these updates are also changing safety-critical systems like brakes and steering.
While the government does have requirements that they trust the OEMs to validate for the sake of speed, those validations still take time to do. Software updates, to me at least, feel like they are moving faster than the time it takes to validate and document these tests, so I'm a bit skeptical of how elaborate Tesla's documentation is for each over-the-air update. Thirdly, it's hard to draw a line on when a software update counts as a "recall." I would argue that anything updating the software of safety-critical systems should be subject to the term "recall." Is it affecting brakes or steering? Yes. Is it adding Netflix to your infotainment? No. Given this, I would think self-driving software should be treated no differently from the software that has traditionally controlled your steering, brakes, or engine/motor, as it touches every one of those systems. Therefore, it is deserving of the title "recall."
Your paper says that the government is more likely to be averse to recalls involving fires, melting parts, and lights. I've actually been a part of one of these. I like that you set up recalls as a recurring game, because during my experience, the firm I was working for at the time was deathly afraid of getting another fine from NHTSA and was willing to do whatever it took to fix the recall within the 60-day span NHTSA requires. Maybe it's anecdotal, but I have personally seen this recurring game work in the government's favor. In this case, the OEM self-reported to NHTSA within 24 hours of finding out about the issue.
Lastly, when it comes to reliability, I would think that self-driving software, much like other software in the vehicle, would need to be reliable to the point that the company should feel legally liable if any issues occur. OEMs release both hardware and software knowing that there will be some failures, somewhere. It's a statistical inevitability. But this statistical inevitability should be quantified and used as a measure of risk, both for the company and its customers. Until the manufacturers think they can handle the financial and public-relations risk of a system failure, engineers should be testing and validating these systems, like they have for every other part of their vehicles in the past. And that is where I think Tesla has been ethically dubious: they are having customers pay to beta test a system for Tesla while potentially putting their safety at risk. They often do this hiding behind the legal obligation for their customers to be ready to take over the vehicle at any time. This is a Level 2 system, after all. But clearly, humans by nature don't react instantaneously to every situation.
I am 100% on board with autonomous technology and its future potential, but I am hoping that this recall puts some caution into Tesla's FSD development team when it comes to pushing out updates. However, I feel it may require further recalls to let Tesla know that this is a repeated game with consequences.
Is recall the right term for it? A software recall would mean that the software is disabled on the user's device, and the user would not be able (or permitted) to use it unless there's a bug fix or upgrade. Here, regulators can play a big role, as carmakers (not only Tesla) expect to generate revenue from software, and disabled software can block their revenue stream. Besides, in such cases, regulators can also ask for proof of a fix before re-enabling the features, to ensure public safety.