Many Americans feel comfortable treating their advanced driver assistance systems (ADAS), which partially automate certain driving functions, as full self-driving systems, according to a study from the Insurance Institute for Highway Safety (IIHS).
The survey explored habits, expectations and attitudes among regular users of General Motors' Super Cruise, Nissan/Infiniti's ProPILOT Assist and Tesla's Autopilot. A total of 604 participants, spread roughly evenly across the three brands, took part in the survey between January and November 2021. All three groups were found to be more likely to engage in non-driving-related activities, like texting or eating, while using their systems than while driving manually.
That was especially true for Super Cruise and Autopilot users, who were more likely than ProPILOT Assist users to report performing activities that would take their hands off the wheel and eyes off the road. Super Cruise and Autopilot users also said they could perform such tasks better and more often while using their systems, according to the study.
A total of 53% of Super Cruise users, 42% of Autopilot users and 12% of ProPILOT Assist users said they were comfortable treating their systems as self-driving.
The study’s publication follows a series of incidents involving the safety of Tesla’s Autopilot system and, by extension, its “Full Self-Driving” (FSD) system, the company’s more advanced ADAS. Last month, some Tesla drivers filed suit against the company, alleging that it falsely advertised the autonomous capabilities of Autopilot and FSD, an accusation California’s Department of Motor Vehicles has also recently leveled against Tesla.
In August, the National Highway Traffic Safety Administration (NHTSA) asked Tesla to provide more information about its cabin camera, which is meant to monitor the alertness of drivers using Autopilot and FSD, as part of its ongoing probe into 830,000 Teslas equipped with Autopilot. NHTSA is currently investigating 16 crashes in which Tesla drivers had potentially engaged such systems before crashing into stationary emergency vehicles.
GM’s Super Cruise, by comparison, does not appear in NHTSA’s database of investigations. That said, NHTSA has only required manufacturers of automated driving systems and ADAS to report crashes to the agency since last year. Super Cruise first appeared on vehicles in 2017, so it’s possible crashes occurred during the intervening four years. GM did not respond to TechCrunch’s request for information about crashes, nor did it share the number of vehicles on the road with Super Cruise enabled. Whatever that number is, it is almost certainly far smaller than the number of vehicles with Tesla’s Autopilot, which comes standard on all new Teslas.
Super Cruise’s safeguards have also been described as more robust than Tesla’s. When Consumer Reports rated the ADAS on certain vehicles earlier this year, Super Cruise and Ford’s Blue Cruise were the only systems to receive two extra points for encouraging safe driving. During CR’s tests of different GM vehicles, the organization said that each delivered “multiple warnings to grab an inattentive driver’s attention.”
“If the driver still does not react, the system will start to slow the car down on its own, eventually bringing it to a stop. The system won’t operate if the camera is covered,” according to a statement from CR.
In Tesla Model Y and Model S vehicles running software version 11.0, drivers could use Autopilot even with the cabin camera fully covered, according to CR. If the camera detected that the driver’s eyes were off the road, it shortened how long the driver could keep their hands off the wheel. But as long as the driver’s hands remained on the wheel, CR found that the system issued no warnings even when the driver’s eyes were off the road.
Some drivers in the IIHS survey said such safeguards, like attention reminders and lockouts, were annoying and that they would try to circumvent them. Most, however, said they found the safeguards helpful and felt safer with them.
The study suggests that driver monitoring systems and “multifaceted, proactive user-centric safeguards” are key to shaping appropriate behavior and to helping drivers understand their role while using partial driving automation.
“Some regular users have a poor understanding of their technology’s limits,” the study reads. “System design appears to contribute to user perceptions and behavior.”