Blame the machines
Two months ago, an Uber self-driving vehicle struck and killed a pedestrian in Tempe, Arizona, but the incident seems to have done little to dampen the political enthusiasm for robot cars.
The accident raised serious questions about the design of the software used in autonomous vehicles and about the testing methods employed. It has been investigated by America's National Transportation Safety Board (NTSB), whose preliminary report contains some troubling details.
Did the Volvo safety features fail?
The first question concerns the vehicle Uber used: a Volvo XC90, a large SUV packed with safety features. It has a well-proven collision-avoidance and emergency-braking technology called City Safety, which normally acts as a failsafe against driver error. This video (now over three years old, which shows the maturity of Volvo's system) demonstrates how well the system works, even at night, and even with pedestrians and cyclists.
Did the Volvo system fail in Arizona? Why did it not hit the brakes when Uber's own system failed to respond to the pedestrian? The NTSB report says that Volvo's safety features are disabled whenever the Uber software has control of the vehicle, but offers no explanation of why someone at Uber decided to turn off the failsafes.
Was the pedestrian invisible?
This also raises the question of why the Uber sensors did not detect the pedestrian. According to the NTSB report, they did, a full six seconds before impact, when the vehicle was still well over 100 meters from the pedestrian, more than the length of a football pitch. Data from the vehicle shows that the Uber system detected that the paths were converging and classified the pedestrian first as an unknown object, then as a vehicle, and finally as "a bicycle with varying expectations of future travel path". At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed, but did not apply the brakes. Why not? The following is the exact quote from the NTSB report:
"According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator."
That is a truly shocking revelation: someone at Uber took the decision to turn off life-saving emergency braking because it was causing the vehicle to behave erratically. If Uber could not run the vehicle properly with its safety features turned on, the vehicle had no business being on public roads.
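A rough sanity check puts those NTSB timeline figures in perspective. The numbers below are my own assumptions, not figures from the report: a constant speed of 19 m/s (about 68 km/h), chosen only to be consistent with "well over 100 meters in six seconds", and a deceleration of 7 m/s², a typical value for hard braking on dry tarmac.

```python
# Back-of-envelope check of the distances in the NTSB timeline.
# ASSUMPTIONS (not from the NTSB report): speed = 19 m/s, decel = 7 m/s^2.

speed = 19.0   # m/s, assumed constant approach speed
decel = 7.0    # m/s^2, assumed emergency-braking deceleration

detection_gap = speed * 6.0   # distance covered in the 6 s after detection
late_gap = speed * 1.3        # distance left when braking was deemed needed

# Impact speed had full braking started 1.3 s out, from the
# uniform-deceleration kinematic relation v^2 = v0^2 - 2*a*d:
v_squared = speed ** 2 - 2 * decel * late_gap
impact_speed = max(v_squared, 0.0) ** 0.5

print(f"distance at detection:      {detection_gap:.0f} m")    # 114 m
print(f"distance at 1.3 s out:      {late_gap:.1f} m")         # 24.7 m
print(f"impact speed if braking:    {impact_speed:.1f} m/s")   # 3.9 m/s
```

Even with these illustrative numbers, braking at the 1.3-second mark would have shed most of the vehicle's speed before impact, which makes the decision to disable emergency braking all the harder to defend.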
Why didn't the safety driver intervene?
That still leaves the question of the safety driver, who was there to stop just this sort of accident. In this aspect, everyone who commented on the Uber dashcam video, myself included, owes the safety driver an apology. In the dashcam footage, we see her with eyes down, glancing up every few seconds, and apparently fiddling with a phone or tablet. Everyone knows the dangers of distracted driving, and a trained driving specialist should know better than most. However, the driver told the NTSB that she was using the Uber system for data logging. Again, from the NTSB report:
"The operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review."
Other companies performing autonomous vehicle testing have had two people in the car: one a highly trained driver, the other sitting in the passenger seat and responsible for data logging and making notes. In Uber's case, someone took the decision to have one person perform both tasks, and furthermore, its safety drivers were not advanced drivers in any sense of the word. In fact, the driver on the night of the accident has several speeding and red-light violations on her record, as well as a conviction several years ago for driving whilst her license was suspended. It has been revealed that Uber's recruitment policy for its safety drivers is the same as for any other Uber driver: "no more than three minor driving offenses in the last three years".
The race to market
In any sane world, we would expect a lot of evidence of systematic testing and validation before a piece of software was allowed to drive itself, and two tons of metal, around city centres and along fast-moving crowded motorways. Yet politicians the world over are eager to boast that their city or country is the world leader in autonomous cars. Back in 2015, when Uber announced the project and that it would be testing in Pittsburgh, the city's mayor told the world that Pittsburgh had cut through red tape to become "a 21st-century laboratory for technology". That puts pedestrians and other motorists somewhere on the scale of lab rats.
The firms themselves are in a race to market because they know the first to market, whatever the product, gets all the publicity, often becomes a de facto standard, and has a huge commercial advantage. Too many software writers nowadays take the approach "Let's get it out there, and fix the bugs later". Deadlines are often driven by marketing plans, not software completeness.
This same race is happening in the automotive industry. Mobileye, now owned by Intel, says it will have autonomous vehicle electronics by 2021, and Volvo is working to a similar timescale with its own autonomous vehicles. That's only three years away. Cruise, a subsidiary of General Motors, has a more ambitious plan and aims to have autonomous taxis rolling off the production lines in 2019, and Audi is also targeting 2019 for its product. The boldest prediction, though, comes from Waymo (Google's self-driving car project), which claims its driverless taxis will be picking up passengers before the end of the year.
30th May 2018
This article comes from the SKILLZONE email newsletter, published monthly since January 2008, and covering topics related to technology and the internet. All articles and artwork in the SKILLZONE newsletter are original content.