The investigation was inspired by reports that Teslas using the electric car maker's advanced driver-assist feature, Autopilot, collided with about 16 stationary emergency vehicles between 2018 and 2021, according to Ben Nassi, a cybersecurity and machine-learning researcher at Ben-Gurion University who worked on the paper. "It was pretty clear to us from the beginning that the crashes could be related to the illumination of the emergency flashers," Nassi says. "Ambulances, police cruisers, and fire trucks come in different shapes and sizes, so it is not the type of vehicle that causes this behavior."
A three-year investigation by the US National Highway Traffic Safety Administration into collisions between Teslas and emergency vehicles eventually led to a broad recall of Tesla's Autopilot software, which is designed to perform some driving tasks (such as steering, accelerating, braking, and changing lanes on certain types of roads) without driver intervention. The agency concluded that the system did not adequately ensure that drivers paid attention and remained in control of their vehicles while it was activated. (Advanced driver-assist packages from other manufacturers, such as General Motors' Super Cruise and Ford's BlueCruise, also perform some driving tasks, but they require the driver to stay attentive at the wheel. Unlike Autopilot, these systems work only in areas that have been mapped.)
Some acknowledgment, but not from Tesla
In a written statement sent in response to questions from WIRED, NHTSA spokesperson Lucía Sánchez acknowledged that emergency flashers may play a role: "We are aware of some advanced driver assistance systems that have not responded appropriately when emergency flashers were present in the driving path under certain circumstances," Sánchez wrote.
Tesla, which disbanded its public relations team in 2021, did not respond to WIRED's request for comment. The camera systems the researchers used in their tests were manufactured by HP, Pelsee, Azdome, Imagebon, and Rexing; none of those companies responded to WIRED's requests for comment.
Although NHTSA acknowledges problems in "some advanced driver assistance systems," the researchers are careful to say they do not know whether the emergency-light effect they observed has anything to do with Tesla's Autopilot problems. "I don't claim to know why Teslas crash into emergency vehicles," Nassi says. "I don't even know if this is still a vulnerability."
The researchers' experiments also dealt only with image-based object detection. Many automakers use additional sensors, such as radar and lidar, to detect obstacles on the road. A smaller group of technology developers, Tesla among them, maintains that image-based systems, paired with sophisticated artificial-intelligence training, can power not only driver-assistance systems but also fully autonomous vehicles. Last month, Tesla CEO Elon Musk claimed that the automaker's vision-based system would enable autonomous driving next year.