Car manufacturers are pushing autonomous technologies. This engineer wants limits.


Last fall, Missy Cummings sent a document to her colleagues at the National Highway Traffic Safety Administration that revealed a surprising trend: when people using advanced driver assistance systems die or are injured in a car crash, they are more likely to have been speeding than people driving on their own.

The two-page analysis of nearly 400 crashes involving systems like Tesla’s Autopilot and General Motors’ Super Cruise is far from conclusive. But it raises new questions about the technologies installed in hundreds of thousands of cars on US roads. Dr. Cummings said the data suggested that drivers are relying too heavily on the systems’ capabilities and that automakers and regulators should limit when and how the technology is used.

People “trust the technology too much,” she said. “They are letting the cars speed. And they are getting into accidents where they are seriously injured or killed.”

Dr. Cummings, an engineering and computer science professor at George Mason University who specializes in autonomous systems, recently returned to academia after more than a year with the agency. On Wednesday, she will present some of her findings at the University of Michigan, a short drive from Detroit, the main center of the US auto industry.

Systems like Autopilot and Super Cruise, which can steer, brake and accelerate vehicles independently, are becoming more common as automakers compete for car buyers with promises of superior technology. Companies sometimes market these systems as if they are making cars autonomous. But their legal fine print requires drivers to remain alert and ready to take control of the vehicle at all times.

In interviews last week, Dr. Cummings said automakers and regulators should prevent such systems from exceeding the speed limit and require drivers to keep their hands on the wheel and their eyes on the road.

“Car companies — meaning Tesla and others — are marketing this as hands-free technology,” she said. “This is a nightmare.”

However, these are not actions that NHTSA can easily take. Any attempt to limit the use of driver assistance systems would probably draw criticism and lawsuits from the auto industry, particularly from Tesla and its chief executive, Elon Musk, who has long chafed against rules he considers antiquated.

Safety experts also said the agency was chronically underfunded and lacked the qualified staff it needed to do its job adequately. The agency has also gone without a Senate-confirmed permanent head for most of the past six years.

Dr. Cummings acknowledged that putting in place the rules she is calling for would be difficult. She said she also knew her comments could reignite attacks from supporters of Mr. Musk and Tesla, who assailed her on social media and sent her death threats after she was named a senior adviser at the agency.

But Dr. Cummings, 56, one of the Navy’s first female fighter pilots, said she felt compelled to speak out because “the technology is being misused by people.”

“We need to make regulations that deal with that,” she said.

The safety agency and Tesla did not respond to requests for comment. GM cited studies it had conducted with the University of Michigan examining the safety of its technology.

Because Autopilot and other similar systems allow drivers to relinquish active control of the car, many safety experts worry that the technology will lull people into believing the cars are driving themselves. When the technology malfunctions or cannot handle a situation, such as having to swerve quickly around a stalled vehicle, drivers may be unprepared to take control fast enough.

The systems use cameras and other sensors to check whether the driver’s hands are on the steering wheel and their eyes are on the road, and they shut off if the driver is inattentive for an extended period. But they can operate for long stretches while the driver is not focused on driving.

Dr. Cummings has long warned that this can be a problem — in academic papers, in interviews and on social media. She was appointed senior safety adviser to NHTSA in October 2021, shortly after the agency began collecting crash data involving cars using driver assistance systems.

Mr. Musk responded to her appointment with a post on Twitter accusing her of being “extremely biased against Tesla,” without providing evidence. That set off an avalanche of similar comments from his supporters on social media and in emails to Dr. Cummings.

She said she eventually had to shut down her Twitter account and temporarily leave her home because of the harassment and death threats she was receiving at the time. One threat was serious enough that it was investigated by the police in Durham, N.C., where she lived.

Many of the claims were nonsensical and false. Some of Mr. Musk’s supporters noted that she was a board member of Veoneer, a Swedish company that sells sensors to Tesla and other automakers, but confused the company with Velodyne, a US company whose laser sensor technology, known as lidar, is seen as a competitor to the sensors Tesla uses for Autopilot.

“We know you own lidar companies and if you accept the NHTSA consultant position we will kill you and your family,” one email sent to her said.

Jennifer Homendy, who leads the National Transportation Safety Board, the agency that investigates serious car crashes and which has also come under attack from fans of Mr. Musk, told CNN Business in 2021 that the false claims about Dr. Cummings were a “calculated attempt to distract from the real safety issues.”

Before joining NHTSA, Dr. Cummings stepped down from Veoneer’s board, sold her shares in the company and recused herself from the agency’s investigations that solely concerned Tesla, including one announced before her arrival.

The analysis, which she sent to government officials in the fall, looked at advanced driver assistance systems from several companies, including Tesla, GM and Ford Motor. When cars using these systems were involved in fatal crashes, they were traveling over the speed limit 50 percent of the time. In crashes with serious injuries, they were speeding 42 percent of the time.

For crashes that did not involve driver assistance systems, the figures were 29 percent and 13 percent.

The amount of data the government has collected on crashes involving these systems is still relatively small. Other factors could skew the results.

Driver assistance systems are used far more often on highways than on city streets, for example. And the crash data Dr. Cummings analyzed is dominated by Tesla, because its systems are more widely used than others. That could mean the results unfairly reflect on the performance of other companies’ systems.

During her time at the federal safety agency, she also examined so-called phantom braking, when driver assistance systems cause cars to slow or stop for no apparent reason. Last month, for example, the news site The Intercept published footage of a Tesla vehicle inexplicably braking in the middle of the Bay Bridge connecting San Francisco and Oakland, causing an eight-car pileup that injured nine people, including a 2-year-old boy.

Dr. Cummings said data from automakers and customer complaints showed that phantom braking was a problem with several driver assistance systems and with the robotaxis operated by companies like Waymo, which is owned by Google’s parent company, and Cruise, a division of GM. Those self-driving taxis, now being tested in several cities, are designed to operate without a driver, and they are carrying passengers in San Francisco and the Phoenix area.

Many crashes appear to occur because drivers behind these cars are not prepared for such unpredictable stops. “The cars brake in ways that people cannot predict and cannot react to,” she said.

Waymo and Cruise declined to comment.

Dr. Cummings said the federal safety agency should work with automakers to restrict advanced driver assistance systems through its standard recall process, in which companies agree to make changes voluntarily.

But experts doubted that automakers would make such changes without a significant fight.

The agency could also create new rules explicitly governing the use of these systems, but that would take years and could lead to lawsuits.

“NHTSA could do that, but would the courts uphold that?” said Matthew Wansley, a professor at Yeshiva University’s Cardozo School of Law in New York, who specializes in emerging automotive technologies.

Dr. Cummings said robotaxis were expanding at about the right pace: after limited testing, federal, state and local regulators are keeping their growth in check until the technology is better understood.

But, she said, the government needs to do more to ensure the safety of advanced driver assistance systems like Autopilot and Super Cruise.

NHTSA “needs to flex its muscles more,” she said. “It must not be afraid of Elon or of moving markets when there is a manifestly unreasonable risk.”
