The auto industry is buzzing after a reported crash earlier this month involving a Tesla Model Y operating in “Full Self-Driving” mode. The crash is still under investigation, but if confirmed, it would be the first accident on record attributed to machine error. No injuries or fatalities were reported, and the crash appears to have happened on an isolated road. Tesla has remained largely silent on the development, and the company is not expected to address the issue in the near future. According to its Q2 Vehicle Safety Report, Tesla recorded one accident for every 4.2 million miles driven with Autopilot engaged. By comparison, the US average for human drivers is roughly one accident for every 484,000 miles. However, the company also reports that Tesla drivers not using Autopilot averaged one accident per 1.2 million miles. Has Tesla unlocked the key to fixing user error?
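Taken at face value, those figures invite a quick back-of-the-envelope check. The sketch below is a minimal Python illustration using only the numbers reported above; it assumes the three rates are directly comparable, a point critics dispute, since Autopilot miles skew toward highway driving, where accidents are rarer per mile.

```python
# Back-of-the-envelope comparison of the accident rates reported above.
# Assumption: the three figures are directly comparable, which is contested,
# because Autopilot miles skew toward highways.

MILES_PER_ACCIDENT_AUTOPILOT = 4_200_000     # Tesla, Autopilot engaged
MILES_PER_ACCIDENT_TESLA_MANUAL = 1_200_000  # Tesla, Autopilot off
MILES_PER_ACCIDENT_US_AVERAGE = 484_000      # US average, human drivers

# Higher miles-per-accident means fewer accidents per mile driven.
autopilot_vs_us = MILES_PER_ACCIDENT_AUTOPILOT / MILES_PER_ACCIDENT_US_AVERAGE
manual_vs_us = MILES_PER_ACCIDENT_TESLA_MANUAL / MILES_PER_ACCIDENT_US_AVERAGE
autopilot_vs_manual = MILES_PER_ACCIDENT_AUTOPILOT / MILES_PER_ACCIDENT_TESLA_MANUAL

print(f"Autopilot vs US average:    {autopilot_vs_us:.1f}x fewer accidents per mile")
print(f"Tesla manual vs US average: {manual_vs_us:.1f}x fewer accidents per mile")
print(f"Autopilot vs Tesla manual:  {autopilot_vs_manual:.1f}x fewer accidents per mile")
# => roughly 8.7x, 2.5x, and 3.5x respectively
```

Notably, the gap between Autopilot and Tesla drivers without it is closer to 3.5x than the headline 8.7x, suggesting part of the advantage lies with the cars and their drivers rather than the software alone.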
A Legal Conundrum
While safety and reliance on emerging technology dominate the global conversation, the more pressing issue is legal. Determining liability, especially in fatal crashes, is a gray area. The simple advice from a lawyer is to deflect fault, which is convenient for every party except the victim. Rather than assuming crashes won’t happen, the more important question is: who is responsible? That question has come to the forefront on the West Coast, where Amazon announced it would begin testing its autonomous delivery fleet in Seattle. The announcement was met with swift pushback from city leaders and safety advocates, who point to recent crashes, such as the one in Los Angeles earlier this month, as a bad omen.
To Reduce Human Error or to Rely on Human Intervention
But Amazon points to the data, where statistics indicate that driverless cars are safer than human-driven ones. More precisely, Amazon points to Zoox, the company it has tapped to supply its autonomous fleet. According to Zoox, a trained employee will sit in the driver’s seat of every vehicle, ready to take over in the event of machine error. But is the ability to manually override the system enough to prevent an accident? In the case of the Los Angeles Model Y crash, the answer is unclear. It was not reported whether the driver was alert at the time of the crash, nor were other critical details disclosed, such as cell phone usage or other potential distractions. However, as experts at the Aaron Allison Law Firm have stated, safety should be a priority, and that is something all parties agree on.
A Cautionary Tale
Accidents involving driverless cars will happen, and when they do, the legal ramifications will be a difficult field to navigate. More testing of autonomous vehicles, such as Amazon’s dry run scheduled for the near future, is critical to building public acceptance and assurance that the new technology is safe. But accident or not, state regulators face the challenge of deciding which party to hold responsible, or how to regulate them. Multiple companies are competing for territory in this exclusive market, creating a new kind of road race, yet none of them is rushing to accept accountability with the government in case things go wrong.