Researchers from the Dawn Project, whose goal is to ban unsafe software in critical systems, have published a video of tests of Tesla's proprietary autopilot. During test runs held at Willow Springs Raceway near Rosamond, California, a Tesla Model 3 sedan hit a child dummy three times.
The car's Full Self-Driving (FSD) system, running the latest version 10.12.2, was active during the tests. The researchers also dressed the child dummy in clothes of different colors: red, white and black.
Despite the fact that the test runs were held in clear, dry weather and the car was moving at a speed of only 40 km/h, the electronics could not recognize the child on the road and brake in time.
Project founder Dan O’Dowd called the results “very disturbing” and Tesla’s software “the worst commercial software he’s ever seen”. In his opinion, regulators should ban the use of FSD until it starts working properly.
It should be noted that Dan O’Dowd is a security engineer and also the founder and CEO of Green Hills Software, an American private company that produces real-time operating systems and development tools for embedded systems. O’Dowd has helped create secure operating systems for a variety of projects, including the Boeing 787, Lockheed Martin’s F-35 fighter jet, the Boeing B-1B strategic nuclear bomber, and NASA’s Orion spacecraft.
Given that 100,000 Tesla drivers are already using the FSD system on public roads, the Dawn Project argues that it poses a threat to others, including children. Tesla CEO Elon Musk, of course, disagrees: he calls his autopilot “awesome software”, but the authors of the Dawn Project remain unconvinced.