Self-Driving Uber Car Kills A Pedestrian
Cars are becoming smarter every day. Parallel parking assist, emergency braking, proximity alerts – these features are common in newer models. The next step, of course, is self-driving cars. Manufacturers have been testing self-driving, or fully autonomous, cars for years. It is only natural that autonomous vehicle technology and taxi services should eventually work together. Arizona is at the forefront of autonomous vehicle testing, with over 600 self-driving vehicles currently on the roads.
In March, the state of Arizona even eliminated the requirement that autonomous vehicles have a human driver on board. California was not far behind, allowing tests of completely driverless vehicles beginning in April of this year. Both states are competing for the economic windfall that autonomous vehicle manufacturing facilities could bring. Arizona, according to Kevin Biesty of the Arizona Department of Transportation, believes in “being part of the innovation and trying to stay out of the way of innovation.”
A recent fatal accident involving an autonomous Uber vehicle has slightly cooled the enthusiasm for forging ahead with more driverless testing. Arizona has temporarily suspended Uber’s driverless testing while the accident is investigated. The incident has raised questions not only about the safety of self-driving cars, but also about where responsibility lies when an accident does happen.
The Uber vehicle that struck and killed a pedestrian in Arizona did have a human driver aboard. Somehow, both the vehicle’s object recognition software and the human safety driver failed to register the pedestrian in time for the vehicle to stop. When tests resume, as they eventually will, more accidents are likely, and personal injury lawyers and courts will have to sort out the liability issues. In the Arizona case, for example, several parties may bear responsibility for the accident. The safety driver, of course, failed to see the pedestrian and apply the brakes. Preliminary investigations suggest that the threshold at which the vehicle’s object recognition software dismissed detections as false positives was set too high, causing the vehicle to treat the pedestrian as an obstacle it could safely ignore. In that case, responsibility might rest with the manufacturer, the software designer, the programmer, and even with Uber itself.
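To make the “threshold set too high” idea concrete, here is a minimal, purely illustrative sketch of a post-detection filter. The class, field names, and threshold value are hypothetical, not drawn from Uber’s actual software; the sketch simply shows how a cutoff tuned aggressively to suppress false positives can also discard a genuine hazard.

    # Illustrative sketch only: a generic post-detection filter, NOT
    # Uber's actual code. Detection, its fields, and IGNORE_BELOW are
    # hypothetical names chosen for this example.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str         # e.g. "pedestrian", "vehicle", "unknown"
        confidence: float  # detector's confidence score, 0.0 to 1.0

    # Raising this cutoff suppresses phantom braking on false positives,
    # but a real hazard with a middling score is discarded with the noise.
    IGNORE_BELOW = 0.9  # deliberately strict, for illustration

    def hazards_to_brake_for(detections):
        """Keep only detections the planner should treat as obstacles."""
        return [d for d in detections if d.confidence >= IGNORE_BELOW]

    # A pedestrian seen at night may score well below the cutoff:
    frame = [Detection("pedestrian", 0.72), Detection("plastic bag", 0.31)]
    print(hazards_to_brake_for(frame))  # [] -- both dismissed as noise

The tradeoff is unavoidable: set the bar too low and the car brakes for every shadow and plastic bag; set it too high and, as alleged here, it drives through a real obstacle.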
As with all new technology, there will be some who are quick with an “I told you so!” It is, of course, inevitable that autonomous vehicles will be involved in, and cause, accidents. Human beings also cause accidents every day. At least a self-driving car can’t get drunk, text, or decide, in a fit of rage, to deliberately hit someone. A driverless vehicle cannot discriminate against passengers because of their race, and cannot harass female passengers, as has been alleged in a class action suit against Uber. Nevertheless, for the foreseeable future, every accident involving an autonomous vehicle will be scrutinized carefully.
We can rest assured that insurance companies, attorneys, judges, and legislators will be able to work through the liability issues and come up with policies and regulations to govern autonomous vehicles. Meanwhile, testing should continue, and we can all look forward to the day when having to make small talk with your Uber or taxi driver is a thing of the past.
Author: Brian Magnosi