When the self-driving car is snapped by the automated speed camera
Politicians worldwide are rewriting legislation to accommodate a future in which the world's roads are populated by self-driving cars, but in doing so they are uncovering a whole new set of thorny issues. One question that arises is, "If a self-driving car gets a speeding ticket, who should pay it?"
This and other issues were raised recently in a UK parliamentary committee discussing the proposed Automated and Electric Vehicles Bill. Oliver Letwin asked what would happen if an autonomous vehicle were involved in an accident. Would the insurance companies pay for the damage incurred? A representative of the motor insurers said yes, they would, although with an automated vehicle it would likely be the vehicle itself that was insured, rather than the occupants. However, if the owner had not installed safety-critical updates to the software, this would invalidate the insurance.
Letwin then asked what would happen if an automated vehicle was too slow to respond to a change in signage on a smart motorway with variable speed limits, and so incurred a fine from an automated speed camera. Would the insurance company also pay the fine? The insurance representatives said they would not support such a proposal. "Surely the passenger is not then liable," said Letwin. This is a very good point. The vision for automated vehicles is that we fallible humans have no input and are mere passive passengers operating the sat nav. So who pays the fines?
Robot vs Human is already happening
Legal issues have already hit the courts in the USA, even though automated cars are still merely testing prototypes. A biker is suing General Motors after he was knocked off his bike in San Francisco by one of their self-driving cars, a Chevrolet Bolt, claiming the collision caused neck and shoulder injuries. According to the police report, the Chevy identified a gap in the adjacent lane and attempted to move into it. However, sensing that the gap was not large enough, it returned to its own lane, and in doing so hit the motorcyclist, knocking him off the machine.
This sounds like six of one and half a dozen of the other, and GM's position is that the biker is at fault for "attempting to move into the lane before it was safe to do so". But if you were driving your car and started to change lanes before realising you had misjudged the traffic, wouldn't you at least look to make sure it was safe before moving back into your own lane? Even if you thought another motorist had no right to be there, wouldn't you hit the brakes or do anything you could to avoid an accident, especially with a more vulnerable motorbike rider? With autonomous vehicles and their 360-degree awareness, how does this accident even happen?
You cannot automate for stupidity
Tesla cars have a feature called Autopilot, which is Tesla's form of advanced cruise control. A forward-facing camera can read speed limit signs, or use GPS data to work out speed limits, and adjust speed accordingly. The camera can detect potential hazards at a distance of up to a quarter of a mile and apply emergency braking and steering. Autopilot also includes lane centering, automated parking, and even controls the lights at night. But for all that, it is really still just cruise control.
This month in California, a Tesla slammed into the back of a stationary fire truck attending an accident on the freeway. By sheer luck, none of the fire crew were standing at the back of the truck, so they escaped injury, and the driver managed to walk away from the accident, suggesting that the Tesla had applied emergency braking. But when asked how he had not noticed the huge fire truck, the driver told police that the Autopilot was driving the car. Perhaps this Tesla owner should have read the manual and understood what the Autopilot really does.
But if you think that is stupid, just a few days earlier police in San Francisco found a driver stopped in the middle of the Bay Bridge, passed out behind the wheel from too much alcohol. When they asked him why he was driving, he explained that it was okay: it was a Tesla, and he had it on Autopilot.
29th January 2018
This article comes from the SKILLZONE email newsletter, published monthly since January 2008 and covering topics related to technology and the internet. All articles and artwork in the SKILLZONE newsletter are original content. If you would like to receive the newsletter direct to your inbox each month, please SUBSCRIBE here. It is free, and you don't get added to any other mailing lists. It uses best-practice confirmed opt-in only, and you may unsubscribe at any time.