
Wednesday 11 April 2018

Commentary: To err is human, and robotic — the problem with self-driving cars

The problem with so-called self-driving cars is that they are not really self-driving - not yet.
Tesla vehicles equipped with Autopilot, a semiautonomous driving system, and cars being tested by Uber have many features that allow them to operate without much help from a human driver, but still require someone behind the wheel to react if something goes wrong.
Unfortunately, humans get awfully bored waiting for a mostly functional machine to fail - safety experts liken it to watching a clock all day. Fatal crashes in the past month in Arizona and California have put a spotlight on the problem.
The idea of having a human monitor a machine driving, rather than a machine monitoring a human driving, gets it backward in terms of safety, said Jim McPherson, an attorney who studies autonomous vehicles.
"The better the automation gets, the more humans are lulled into a state of security and fail to catch (problems)," McPherson said.
Deborah Hersman, president and CEO of the National Safety Council, a safety advocacy group, said automation can eventually be a huge "game changer" for safety, since human error causes more than 90 percent of crashes. Car crashes caused 37,461 deaths in 2016, according to the National Highway Traffic Safety Administration.
But the transition from partial to full automation will be "challenging," Hersman said.
"The driver is still the car's best safety feature," she said. "They have to pay attention."
Video from the March 18 crash in Tempe, Ariz., shows the Uber test driver looking down and not at the road just before the car strikes and kills Elaine Herzberg, 49, who was walking a bicycle across a street. The official cause of the crash has not yet been determined, according to Tempe police. Uber said that it is cooperating with investigators.
Arizona Gov. Doug Ducey has ordered Uber to stop testing its autonomous vehicles on the state's roads.
On March 23, a Tesla slammed into a highway barrier on U.S. 101 in California, killing the driver, Walter Huang, 38. In a blog post, Tesla attributed the severity of the crash to the fact that a barrier designed to reduce the impact into a concrete lane divider had been crushed in a prior crash and not replaced. Tesla also said that the driver's hands were not detected on the wheel for six seconds prior to the collision, and that he had received several visual and one audible hands-on warning earlier in the drive.
The first known fatal crash of a self-driving car occurred in May 2016 in Florida, killing Tesla driver Joshua Brown.
Despite these tragedies, Tesla said in its blog post that its vehicles equipped with Autopilot are still safer than regular cars - 3.7 times less likely to be involved in a fatal crash.
(Photo: U.S. District Judge William Alsup has slapped restrictions on ride-hailing giant Uber's driverless car research in a trade secrets civil lawsuit filed by archfoe Waymo, Google's autonomous car project. © Jaap Arriens/NurPhoto/Sipa USA/TNS)
Self-driving cars work with the aid of GPS, internal navigational maps, sensors, cameras, lidar and radar to navigate streets and detect objects in the environment. Transportation experts expect that fully self-driving cars could eventually lead to vastly safer roads, since unlike humans, machines cannot get drunk, text while driving or get sleepy.
SAE International, a professional group for automotive engineers, has set standards for the levels of driving automation, which range from Level 0, in which a human controls everything, to Level 5, a car that can behave like a human driver even in extreme environments like dirt roads.
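To make those levels concrete, here is a minimal sketch of the six SAE J3016 levels as a Python enum; the comments are a plain-language gloss, not the standard's exact definitions:
```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Plain-language gloss of the SAE J3016 driving automation levels."""
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # system steers OR controls speed, not both
    PARTIAL_AUTOMATION = 2      # system steers AND controls speed; human must monitor
    CONDITIONAL_AUTOMATION = 3  # system drives; human must take over when asked
    HIGH_AUTOMATION = 4         # system drives itself within a limited domain
    FULL_AUTOMATION = 5         # system drives anywhere, under any conditions
```
Autopilot and the Uber test vehicles described in this article fall well short of Level 5, which is why a human behind the wheel is still required.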
Autonomous cars are being designed and tested by Volvo, GM, Audi, Toyota, Waymo and other companies. But cars on the market now are not fully autonomous.
The airline industry has already seen the pitfalls of too much automation, and automakers are making many of the same mistakes, according to Christine Negroni, an aviation writer and the author of the book "The Crash Detectives: Investigating the World's Most Mysterious Air Disasters." Several fatal airline crashes have been blamed on the phenomenon of pilots mentally disengaging from a plane that is flying itself, Negroni said.
"The farther you get away from the task, the longer it takes to get back into the task when seconds count," Negroni said. She spoke of a friend with a Tesla who believes that his driving skills were deteriorating because of Autopilot.
Negroni emphasized this is not the case of people being "bad," but being human. "One of the great things about humans is they're constantly looking for stimulation," she said. "You can't give humans a task they are ill-equipped to do and expect them to do it."
Alex Roy, a founder of the Human Driving Association, a safety lobbying group, and editor of the online automotive news site The Drive, agreed that overly trusting semiautonomous systems can lead to skill atrophy. He favors what he calls a "parallel" or "guardian" system, in which the driver is in charge and has his or her hands always on the wheel, but the car would help prevent crashes. He explains that this augments drivers' abilities, while protecting their freedoms.
This is similar to what's being done in the railroad industry with positive train control, or PTC, a federally mandated safety system that uses GPS, radios and computers to stop trains that are speeding or are in danger of colliding or derailing. With PTC, the engineer is still in charge of driving the train, but if the engineer does not stop or slow down when he should - because he falls asleep or has a heart attack - PTC will take over to stop the train, explained Bruce Marcheschi, Metra's chief engineering officer.
Roy also favors driver education and driver monitoring, to make sure the driver's attention stays on task.
"I don't believe that anyone should be testing self-driving vehicles on public roads without a driver monitoring system," said Roy, a rally race driver.
A driver monitoring system could be as simple as a second person in the car - Uber used to employ two testers instead of one, Roy said. Another method is some kind of electronic system to monitor the driver's attention. An example is GM's Super Cruise, which tracks a driver's eye movements and head position, said Ben Pierce, a transportation technology expert who spoke last week on a Metropolitan Planning Council panel on autonomous cars.
"If you quit paying attention, they'll start beeping and warning you, 'pay attention, pay attention'," said Pierce. "If you ignore all of that, it will pull itself over to the side of the road. That's a good way that the technology is starting to mature and catch up and will become about the technology helping us humans be safe, rather than stretching the boundaries of technology."
___
ABOUT THE WRITER
Mary Wisniewski is the transportation reporter and columnist for the Chicago Tribune.
Visit the Chicago Tribune at www.chicagotribune.com
