
Tesla CEO Elon Musk announcing the Model X car, which came with Autopilot features, last year. AP Photo/Marcio Jose Sanchez

Tesla says its self-driving feature wasn't to blame for fatal crash

It claimed that Autopilot’s safety record had crossed the “better-than-human” threshold.

TESLA SAYS IT did nothing wrong by disclosing a fatal accident involving its car’s self-driving feature weeks after it happened.

Back in May, a man died in a crash in Florida while his Tesla was on Autopilot. The incident happened after the car’s cameras failed to distinguish the white side of a turning tractor-trailer from a brightly-lit sky, and the car didn’t activate its brakes.

Fortune published a report criticising Tesla for announcing the news almost eight weeks after the crash, and only after the company had sold more than $2 billion worth of shares in the meantime. Another article said that three days after the crash occurred, CEO Elon Musk said it wasn’t ‘material’ information that Tesla investors needed to know.

Tesla criticised the publication, saying the report was “fundamentally incorrect” and that its self-driving feature was working as intended. It also claimed that Autopilot’s safety record had crossed the “better-than-human” threshold.

“In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle’s position in lane and adjusts the vehicle’s speed to match surrounding traffic,” it said in its statement.

“Given its nature as a driver assistance system, a collision on Autopilot was a statistical inevitability, though at this point, not one that would alter the conclusion already borne out over millions of miles that the system provided a net safety benefit to society,” the statement continued.

Fortune responded by saying it fully stood by its reporting and its articles.

Self-driving car death. AP Photo/Mark Schiefelbein

Tesla’s Autopilot feature requires drivers to keep their hands on the wheel at all times. The setting is designed for relatively predictable conditions, such as driving in a traffic jam or on the motorway.

The system uses a combination of GPS, a forward-facing camera, 360-degree ultrasonic sensors, forward radar, and mapping data to determine the car’s position and help it navigate.

When Tesla first announced the software back in October, Musk advised drivers to be cautious when using it, as they would be liable if they got into a crash.

“It should not hit pedestrians, hopefully. We want people to be quite careful [when using it],” he said at the time.

