Report: Tesla Autopilot Crash Driver Ignored Warnings
A man killed when his semi-autonomous Tesla car crashed ignored multiple warnings issued by the vehicle, newly released documents show. The documents also suggest that subsequent changes to the way the car's Autopilot feature works should prevent a repeat of the incident.
Tesla's Autopilot is a brand name covering technology that is usually classed as "driver assistance" rather than strictly self-driving. At the time of the crash, it comprised an adaptive cruise control that kept the car at a set speed while taking account of nearby vehicles, and an auto-steer function that kept the car within its lane.
Joshua Brown died last year after crashing into a tractor-trailer at more than 70 miles per hour. The crash drew media attention because it was the first known fatal crash involving a car using at least some form of self-driving technology. (Source: ntsb.gov)
The Washington Post notes that Tesla vehicles clocked up more than 130 million miles of driving under the Autopilot feature before its first fatal crash. By comparison, the United States sees a fatal crash roughly every 100 million vehicle miles, according to the Insurance Institute for Highway Safety. (Source: washingtonpost.com)
No Conclusions In Case
The National Transportation Safety Board has now issued a docket of documents containing the facts it gathered about the case. The docket doesn't contain any analysis, conclusions, or recommendations, as these will follow when the investigation is complete.
The documents show Brown drove for 41 minutes before the crash, 37 minutes of which involved the Autopilot feature. While using Autopilot, Brown kept his hands off the wheel almost continuously, contrary to Tesla's advice.
13 Warnings Before Crash
During this time, the car gave six audible warnings that Brown needed to put his hands back on the wheel. The car dashboard also gave seven visual warnings.
Since the crash, Tesla has made several changes to Autopilot. It now issues a warning if the driver has their hands off the wheel for more than a minute while traveling at 45 miles per hour or more; if the driver ignores three such warnings within a one-hour period, Autopilot is switched off and can't be reactivated until the car has been parked and restarted.
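For readers curious how such an escalation rule works, the behavior described above can be sketched in a few lines of code. This is purely an illustration of the described policy; the class name, thresholds, and structure are assumptions, not Tesla's actual software.

```python
# Illustrative sketch of the hands-off warning escalation described above.
# All names and thresholds are assumptions based on the article, not Tesla code.

HANDS_OFF_LIMIT_S = 60       # warn after more than one minute hands-off
SPEED_THRESHOLD_MPH = 45     # rule applies at or above this speed
STRIKES_TO_LOCKOUT = 3       # ignored warnings before Autopilot disables
STRIKE_WINDOW_S = 3600       # strikes counted within a one-hour window

class AutopilotMonitor:
    def __init__(self):
        self.warning_times = []   # timestamps of warnings already issued
        self.locked_out = False

    def check(self, now_s, speed_mph, hands_off_s):
        """Return 'ok', 'warn', or 'lockout' for the current state."""
        if self.locked_out:
            return "lockout"
        if speed_mph >= SPEED_THRESHOLD_MPH and hands_off_s > HANDS_OFF_LIMIT_S:
            # keep only warnings inside the rolling one-hour window
            self.warning_times = [t for t in self.warning_times
                                  if now_s - t < STRIKE_WINDOW_S]
            self.warning_times.append(now_s)
            if len(self.warning_times) >= STRIKES_TO_LOCKOUT:
                self.locked_out = True   # stays off until parked and restarted
                return "lockout"
            return "warn"
        return "ok"

    def park_and_restart(self):
        """Parking and restarting the car clears the lockout."""
        self.locked_out = False
        self.warning_times = []
```

In this sketch, three hands-off warnings inside an hour disable the system, and only a park-and-restart cycle re-enables it, mirroring the policy the article describes.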
Tesla has also removed the option to set Autopilot's speed above the prevailing speed limit, the only exception being on freeways where a barrier separates lanes running in opposite directions.
What's Your Opinion?
Is driver assistance technology safe enough for use on public roads? Should Tesla have prevented drivers from ignoring safety warnings from the start? Is there any point in having cars that can at least partially drive themselves if drivers have to keep their hands on the wheel?
Dennis Faas is a senior systems administrator and IT technical analyst specializing in cyber crimes (sextortion, blackmail, and tech support scams) with over 30 years of experience, and runs this website.
Comments
Self driving cars
In hindsight, I think Tesla could have implemented stronger safety features from the start (as it has now done) rather than relying on audible alerts and dashboard warnings; that might have prevented the crash.
As for whether cars should only be able to partially drive themselves - I think this is more of a legal question. If cars were completely autonomous and a crash occurred, car companies would likely be held liable, with plenty of bad press and their safety questioned. If a car is only 'partially' autonomous, with the onus on the driver, then car companies are essentially off the hook. I think it will remain that way until it can be proven (if ever) that self-driving cars can never crash, though I am not sure how that will pan out with a mix of autonomous and non-autonomous cars on the road.
said before
WHO is responsible..
The BEST thing they could have done: STOP the car and turn all the warning lights on, because the idiot isn't paying attention.
I want an autonomous car... REALLY. I want to BLAME those responsible for the programming for my death.
I'm not driving, so I don't need a license or insurance.