Autonomous driving technology may never go far enough to be safe


Craig Watkins, Opinion Editor

California is close to adopting new transportation rules that would allow driverless cars to carry passengers as long as they meet some strict conditions. This is unsurprising from the state most associated with advancing technology and early adoption of trends, but the surprising part is the timing.

Less than a month ago, the autonomous driving industry was hit with its biggest and most expected setback yet: a pedestrian was killed by an autonomous vehicle.

The death of Elaine Herzberg in Tempe, Arizona, directly led to the voluntary suspension of Uber’s driverless vehicle tests and brought forth the question of whether this technology was ready for use in populated areas.

The vehicle that struck Herzberg was not technically driverless, as all of Uber’s vehicles in the test had backup drivers. The backup driver’s job is to stay alert and ready to take the wheel should the computer driving the car make a mistake, which happens quite frequently.

Blame for the incident has been directed at Rafaela Vasquez, the backup driver behind the wheel of Tempe’s killer taxicab.

Footage from the Uber vehicle shows that when Herzberg was struck, Vasquez was looking into her lap instead of at the road ahead. While an alert backup driver would have prevented the collision, this incident shows that a backup driver is not always going to be alert.

Uber backup drivers are instructed to have their eyes on the road and hands hovering above the steering wheel at all times, but this did not happen. Take a short drive on any busy highway and you will find numerous people who could not be trusted to follow those instructions.

Most of them will not become test backup drivers anytime soon, but if autonomous driving technology gains enough popularity, they may one day own a driverless vehicle they must be responsible for.

The technology powering driverless cars of the future will hopefully be more reliable than what was in Vasquez’s vehicle, but that does not mean it will be infallible.

Based on how well-lit the road was where Herzberg was struck and other Uber backup drivers’ accounts of how sensitive the autonomous vehicles were to stopping, it seems like there was no reason for the car not to brake on its own. The trouble is how unpredictable machines can sometimes be.

Printers have been around far longer than self-driving cars, but I have yet to find one that works properly. Until I find a printer that can stay connected to my computer and only say it’s low on ink when it’s low on ink, I will not trust my life to a machine without a human in control of it.

Luckily, even our nation’s legal system agrees with me. Evidence from DNA and fingerprint analysis has become more prevalent in the past several years because computer databases can more quickly compare evidence to samples obtained from police departments across the country.

This evidence gathered with the help of computers is often used to decide whether someone is guilty or innocent of a crime and could be the deciding factor in whether someone is executed or not. Yet this evidence requires an expert to verify the computer’s findings.

Every time you set foot or wheel onto a road, you are risking your life. If a machine that determines whether one lives or dies requires an expert behind it in one case, it should in every case. And if that drive I told you to take is any indicator, most drivers are not experts.