
Saturday, July 2, 2016

First Tesla Autopilot fatality brings beta-testing into question

As has now been widely reported, a fatal accident involving a Tesla Model S on Autopilot and a tractor-trailer occurred in May in Florida, killing the Tesla driver at the scene.

The Tesla was on a divided highway with Autopilot engaged when a tractor-trailer drove across the highway perpendicular to the Model S. The explanation given by Tesla is that neither the Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brakes were not applied. The high ride height of the trailer, combined with its positioning across the road, caused the Model S to pass under the trailer. The impact struck the windshield and sheared off the entire roof section of the vehicle, killing the driver instantly.

Facts gathered from video reports at the scene and eye witness accounts of the accident:

  • The posted speed limit was 65 mph
  • Another motorist witnessed the Model S travelling well in excess of 85 mph.
  • The intersection was over a slight crest in the direction the Model S was travelling.

While it was a long, straight section of road, travelling at very high speed on a road with a large number of uncontrolled intersections significantly increases the risk of a collision (hence the relatively low posted speed limit). The slight crest would also have reduced the visual range for both drivers.

It seems quite reasonable to expect that the investigation will conclude the primary cause of the accident was excessive speed by the Tesla driver. Records obtained by The Associated Press show the Tesla driver, Joshua Brown, was cited for speeding seven times in Ohio between 2010 and 2015 and once in Virginia.

Mobileye, an Israel-based tech company developing some of the technology behind Tesla's Autopilot, issued a "Statement on Fatal Tesla Model S Autopilot Crash":

"We have read the account of what happened in this case. Today's collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020."

But Tesla and Mobileye now disagree over the lack of emergency braking, with Tesla issuing the following statement:

"Tesla’s autopilot system was designed in-house and uses a fusion of dozens of internally- and externally-developed component technologies to determine the proper course of action in a given scenario. Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature."

This seems to be a rebuke to Mobileye, a supplier of some of the technology used in Autopilot and other driver assistance systems in the Model S. The Mobileye system is not designed for Lateral Turn Across Path (LTAP) detection, and the Tesla part of the system failed because "the high, white side of the box truck" (which apparently failed to register as an interruption of the ground plane as mentioned above) "combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire."
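The braking decision described in the two statements can be sketched as a simple cross-check. This is purely an illustration of the logic as publicly described, not Tesla's actual implementation; the function name and its three inputs are hypothetical:

```python
# Hypothetical sketch of the AEB decision described above: braking fires only
# when a camera-detected interruption of the ground plane cross-checks against
# a consistent radar return, and radar returns that resemble overhead signs
# are discounted. Not Tesla's actual code.

def should_brake(camera_sees_obstacle: bool,
                 radar_return_consistent: bool,
                 radar_looks_like_overhead_sign: bool) -> bool:
    """Return True if automatic emergency braking should fire."""
    if not camera_sees_obstacle:
        # White trailer side against a bright sky: no ground-plane interruption.
        return False
    if radar_looks_like_overhead_sign:
        # High-riding trailer returns a signature similar to an overhead sign.
        return False
    return radar_return_consistent

# The accident scenario as described: the camera missed the white trailer
# and the radar signature resembled an overhead sign, so no braking.
print(should_brake(camera_sees_obstacle=False,
                   radar_return_consistent=True,
                   radar_looks_like_overhead_sign=True))  # prints False
```

The point of the sketch is that either failure alone (camera or radar classification) is enough to suppress braking, which is why both statements can be simultaneously true.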

Either way, it seems clear that Tesla's Autopilot public beta testing needs to be restricted to roads without uncontrolled intersections (e.g. expressways) until LTAP capability becomes available in 2018 and the system can deal with lateral traffic!

This fatal accident is an auto industry nightmare come true and has brought the whole issue of beta testing automotive road safety features with the general public into question. Many in the automotive industry have previously criticised Tesla: former Google scientist Andrew Ng called the company "irresponsible"; Volvo's research and development chief, Dr. Peter Mertens, said "Anyone who moves too early is risking the entire autonomous industry"; Jaguar XF project manager Stephen Boulter said "If something happens [with Autopilot], it could set the technology back a decade"; and BMW CEO Harald Krüger said "We can offer automated driving on the motorway up to 120 kilometers per hour," continuing "But our technology must be 100 percent reliable."

The Silicon Valley business model is built on shipping buggy beta code. "Move Fast and Break Things" is the motto over at Facebook, while hundreds of millions of Apple iPhone owners have updated iOS only to have to install a bug-fix patch just a few days later. This kind of approach clearly does not translate well to the automotive industry, where buggy code in cars can kill people and the automaker is rightfully held liable. As we saw with Toyota's unintended acceleration crisis, which resulted in the deaths of 89 people and 400 wrongful-death and personal injury cases, such failures cost the company more than $2.5B in criminal penalties and class action settlements!

