Tesla Autopilot Error Leads To First Fatality


The first known fatality in a Tesla Model S with the Autopilot system activated has occurred in central Florida. The driver, Joshua Brown of Ohio, regularly posted videos involving his Tesla and had in fact posted one showing a near miss with Autopilot engaged.

Tesla revealed this information through a blog post and mentioned that the National Highway Traffic Safety Administration (NHTSA) was looking into the incident.

According to the company, the Autopilot system did not recognize the trailer, which was perpendicular to the car, mistaking it for an overhead sign under which the car would have passed. The trailer's unusual positioning, combined with the brightly lit sky, fooled the Autopilot system into making this mistake.

The Model S regularly receives the highest possible scores on safety tests; however, an impact between the windshield and the trailer was not something the car was built to withstand.

This incident will raise several questions, not the least of which is whether Tesla was right to deploy software that, by its own admission, is still in beta. Releasing unfinished software to the public may be accepted practice in consumer electronics, where the stakes are not so high, but in a car it has the potential to be fatal. And in this case it was.

Tesla pointed out that the Autopilot system requires the driver to remain alert at all times and keep their hands on the wheel. The company also shared statistics showing that more than 130 million miles had been driven on Autopilot before this fatality, well above the American average of one fatality per 94 million miles driven.
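
To put those two figures side by side, here is a minimal sketch in Python. The normalisation to fatalities per 100 million miles is simply one way of framing the comparison, not something Tesla published:

# Compare the fatality rate implied by Tesla's Autopilot figure
# with the US average, normalised per 100 million miles driven.
AUTOPILOT_MILES_PER_FATALITY = 130_000_000   # Tesla: 130+ million Autopilot miles, one fatality
US_MILES_PER_FATALITY = 94_000_000           # US average: one fatality per 94 million miles

autopilot_rate = 100_000_000 / AUTOPILOT_MILES_PER_FATALITY
us_average_rate = 100_000_000 / US_MILES_PER_FATALITY

print(f"Autopilot:  {autopilot_rate:.2f} fatalities per 100 million miles")
print(f"US average: {us_average_rate:.2f} fatalities per 100 million miles")

On those numbers, Autopilot works out to roughly 0.77 fatalities per 100 million miles against a US average of roughly 1.06, though a single fatality is a very small sample on which to base a rate.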

It will be interesting to see how regulators approach this, since it is entirely new territory. Does Tesla absolve itself of all blame by saying that it informed people its software was not to be trusted completely? Or does the driver bear all responsibility for engaging Autopilot and not using it as the company directed?

The system Tesla is using is not uniquely advanced; similar technology has been developed by a number of other companies, such as GM, which have chosen not to put it on the roads until they have fine-tuned it further. That could be because their software is not as far along as Tesla's, because they have far more cars on the road than Tesla and therefore greater liability, or simply because Tesla was too hasty in adding this feature to its cars.

Volvo, one of the pioneers in self-driving technology, has publicly stated that it will take full responsibility for its automation software once it begins installing it in cars sold to the public.

The debate over who was ‘driving’ the car is one that will surely divide experts. The decision the NHTSA comes to will set an important precedent for the future.
