Tesla: The Public as Crash Test Dummies

July 14, 2016 | Automotive, Telematics

The recent fatal collision involving a Tesla car in Autopilot self-driving mode, followed by another major crash a week later and multiple less dramatic rear-end collisions, is calling attention not only to the state of autonomous-driving technology itself but also to public perception of, and trust in, self-driving cars.

Developing autonomous driving capabilities that are safe under most conditions is proving to be as difficult and time-consuming as some have predicted. Most manufacturers are taking a conventional path, adding driver-assistance features gradually and building toward full or near-full autonomy that they expect to mature by the end of this decade. But Tesla, famous for its willingness to challenge the status quo and take business and technology risks, has chosen a much faster, if riskier, route.

Tesla has been pushing to get advanced functionality, including safety features, on the road faster and more frequently. This strategy relies on software-driven functionality, wireless connectivity, and remote update capabilities to get feedback from vehicles and customers and continually fine-tune systems and features. But this approach may have led to a somewhat relaxed attitude toward releasing software updates that weren’t fully tested and foolproofed. Tesla’s loyal customers are apparently comfortable with this approach, but some may feel differently when they find themselves serving as crash test dummies.

Tesla’s response to the recent incidents centers on the vehicle and the Autopilot technology, explaining why it failed with fatal consequences. In a recent blog post, the company states: “This is the first known fatality in just over 130 million miles where Autopilot was activated,” and goes on to describe the investigation as seeking to “determine whether the system worked according to expectations,” adding the well-advertised caveat: “Always keep your hands on the wheel. Be prepared to take over at any time.”

A Matter of Trust

Autonomous and semi-autonomous driving technology offers more than time-saving convenience. Think, for example, about its tremendous potential impact on the well-being of the disabled and the elderly. It will no doubt reduce the number and the severity of car crashes in a very meaningful way. But to get to this point, the technology must gain the trust of the public: drivers, passengers and pedestrians who share the roads with Tesla drivers on Autopilot.

Trust doesn’t equate to interest, or even excitement, about future technologies. Google’s autonomous car generates excitement, but its fairground-ride look doesn’t necessarily evoke trust. Even one fatality erodes trust in the technology, and even more so in the company responsible for it. This trust is built on a complex confluence of technology maturity, relevant regulations and public perception.

So questions about whether self-driving technology is ready to be released to just any driver on public roads will continue to linger.

But the public will not agree to take the role of crash test dummies.


Image: Michelangelo’s ‘David’ by Eduardo Paolozzi (1987)

  • Following some debates on LinkedIn (https://www.linkedin.com/pulse/tesla-public-crash-dummies-joe-barkai), I wanted to add the following:

    Tesla is constantly in the limelight, usually for its innovation and maverick spirit. And, by the way, I consider the company’s effort to change the status quo in retail operations as important as its technology innovation. But in this case they may have gone a little too fast.

    The point I’d like to emphasize is the need for public trust, which will drive adoption and, in turn, further improve the technology and lower its cost so it is available to all. Tesla’s blog response may not have done enough in this direction.

    My recommendation for the past several years has been to promote the use of autonomous cars in special applications and in safer, better-controlled environments than public roads. Consider, for example, large company and university campuses, transfers between airport terminals and rental car locations, small semi-closed neighborhoods, etc. This will accelerate the maturation of the technology and advance public trust.

  • Tesla is facing its first U.S. legal challenge over self-driving technology in a case alleging the Autopilot software in 47,000 Model S and Model X vehicles is “dangerously defective” when engaged.

    The case was filed as a class action in San Jose, CA, federal court on behalf of Tesla owners who bought their vehicles during the two quarters ending March 31. The plaintiffs claim their cars’ safety features are either non-functioning or unsafe to use. “Unwittingly, buyers of affected vehicles become beta testers of half-baked software that renders Tesla vehicles dangerous if engaged,” the complaint alleges. According to the complaint, activating Autopilot would sometimes cause vehicles to veer out of their lanes, “lurching, slamming on the brakes for no reason, and failing to slow or stop when approaching other vehicles.”

    In a statement, Tesla said: “This lawsuit is a disingenuous attempt to secure attorney’s fees posing as a legitimate legal action, which is evidenced by the fact that the suit misrepresents many facts. Many of the features this suit claims are ‘unavailable’ are in fact available, with more updates coming every month.”