On May 7, 2016, 40-year-old Ohio resident Joshua Brown was involved in a fatal crash while driving a Tesla Model S with the Autopilot system engaged. This was the first fatal crash of a Model S with Autopilot engaged. The National Highway Traffic Safety Administration (NHTSA) is now investigating.
So What Went Wrong?
At the time of the crash, Mr. Brown was operating the Model S on a divided highway in Florida with Autopilot engaged. A tractor-trailer crossed perpendicular to Mr. Brown’s travel path from a non-controlled-access crossing, and because of the white side of the trailer against the brightly lit sky, neither Mr. Brown nor the Autopilot noticed the truck, so the brakes were never applied, manually or automatically. Further, given the height of the trailer, the vehicle’s radar would have registered it as an overhead road sign, a class of object the system is programmed to ignore in order to avoid false braking events. Because of the height at which the two vehicles met, the Model S passed under the trailer, with the bottom of the trailer impacting the windshield. The Tesla team states in their blog that “Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”
Who is to Blame?
Ultimately, the NHTSA will examine the design and performance of the automated driving systems and determine whether there was any defect. It will also gather additional information about all the vehicles involved in the accident and the circumstances surrounding it. The fact that the NHTSA is conducting an evaluation does not automatically mean there was any issue with the Model S Autopilot. Further, it should be noted that although Tesla reported the accident to the NHTSA immediately, it did not disclose the information publicly until June 30th.
The media, however, has jumped at the chance to lambast Elon Musk and Tesla Motors, which had sold $2 billion of shares in an offering on May 18th, only eleven days after the fatal accident. The fatal accident was not disclosed to shareholders at that time. There was a dispute between Fortune Magazine and Elon Musk as to whether or not the fatal crash was “material” to shareholders. While the disclosure of the fatal crash had very little impact on Tesla Motors’ share price, it has been a topic of much discussion on the internet.
Tesla fired back in one of its blog posts that Fortune Magazine had mischaracterized its SEC filing regarding product liability. The filing actually states:
We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims.
Product liability claims could harm our business, prospects, operating results and financial condition. The automobile industry experiences significant product liability claims and we face inherent risk of exposure to claims in the event our vehicles do not perform as expected resulting in personal injury or death. We also may face similar claims related to any misuse or failures of new technologies that we are pioneering, including autopilot in our vehicles and our Tesla Energy products. A successful product liability claim against us with respect to any aspect of our products could require us to pay a substantial monetary award. Our risks in this area are particularly pronounced given the limited number of vehicles and energy storage products delivered to date and limited field experience of our products. Moreover, a product liability claim could generate substantial negative publicity about our products and business and would have material adverse effect on our brand, business, prospects and operating results. We self-insure against the risk of product liability claims, meaning that any product liability claims will have to be paid from company funds, not by insurance.
But is this really a question of product liability? Tesla Motors states that this was the first fatality in over 130 million miles driven with Autopilot activated. In the US alone, there is a fatality every 94 million miles driven. Worldwide, the rate rises to one fatality per 60 million miles driven. In short, by these figures, customers driving with Autopilot engaged are statistically safer than the average driver.
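To make those figures directly comparable, here is a quick back-of-envelope normalization to fatalities per 100 million miles, using only the numbers cited above. Note the caveat any statistician would raise: a single fatality in roughly 130 million Autopilot miles is far too small a sample to draw firm safety conclusions, so this is illustrative arithmetic, not proof.

```python
# Normalize the fatality rates cited in the article to a common basis
# (fatalities per 100 million miles). The figures are those reported
# by Tesla; the Autopilot number rests on a single event, so it is
# illustrative only, not statistically robust.

rates = {
    "Autopilot (Tesla's figure)": (1, 130e6),   # 1 fatality / 130M miles
    "US average":                 (1, 94e6),    # 1 fatality / 94M miles
    "Worldwide average":          (1, 60e6),    # 1 fatality / 60M miles
}

for label, (fatalities, miles) in rates.items():
    per_100m = fatalities / miles * 100e6
    print(f"{label}: {per_100m:.2f} fatalities per 100 million miles")
```

On these numbers, the Autopilot rate (~0.77 per 100 million miles) is below both the US average (~1.06) and the worldwide average (~1.67), which is the basis for Tesla's "better than human" claim discussed below.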
Tesla further asserts that “Given the fact that the ‘better than human’ threshold had been crossed and robustly validated internally, news of a statistical inevitability did not materially change any statements previously made about the Autopilot system, its capabilities, or net impact on roadway safety.”
It was never in question that at some point there would be a fatal crash involving the Autopilot. It was a statistical inevitability.
If the Autopilot had accurately recognized the truck crossing the road, the accident may not have occurred, and Tesla would have accumulated one more instance in which Autopilot proved safer and more reliable than a human driver. However, there is no indication that its failure to recognize the truck made the accident any worse than it would have been with only a human driver operating the car. Just as human beings make errors in judgment on the road, so too can these automated systems, which are still in their testing phases, and they must be continually adjusted for each new circumstance that arises.
Unless the NHTSA’s investigation determines that there is an inherent defect with the Autopilot system or that the circumstances surrounding the crash were not as have been previously reported to the media, I do not see how Tesla Motors can be held liable for this tragic and unfortunate crash.
Is this just the tip of the iceberg? Fortune Magazine reported on July 6th that the NHTSA is now investigating a second Autopilot-related crash, which occurred in Pennsylvania on July 1st. A Tesla Model X struck a turnpike guard rail, then crossed several traffic lanes and landed on its roof in the middle of the roadway. Both driver and passenger were injured. No further details have been reported.