A ‘deeply disturbing’ video claims to show a Tesla in full self-driving mode running over a child-size mannequin during a test by a safety campaign group.
The Dawn Project said the vehicle failed to detect the stationary dummy’s presence in the road and hit it over and over again at an average speed of 25mph.
It claims that the experiment was carried out under ‘controlled conditions’ on a test track in California.
Tesla, which was founded by billionaire entrepreneur Elon Musk, has been approached for comment by MailOnline but has yet to respond to the video.
The US National Highway Traffic Safety Administration (NHTSA) confirmed that it ‘currently has an open and active investigation of Tesla’s Autopilot active driver assistance system’.
A spokesperson said this included the Full Self-Driving software and that the agency ‘considers all relevant data and information that may assist its investigation’.
The Dawn Project released the video as part of its launch of a nationwide public safety ad campaign highlighting the dangers of Tesla’s full self-driving software.
‘The deeply disturbing results of our safety test of Tesla full self-driving should be a rallying cry to action,’ said Dan O’Dowd, founder of The Dawn Project.
‘Elon Musk says Tesla’s Full Self-Driving software is “amazing”. It’s not. It’s a lethal threat to all Americans.’
He added: ‘Over 100,000 Tesla drivers are already using the car’s full self-driving mode on public roads, putting children at great risk in communities across the country.’
The safety test was carried out at the Willow Springs International Raceway and test track in Rosamond, California on June 21.
It was recorded and the footage shows a Tesla in full self-driving mode repeatedly running over the child-sized mannequin.
An NHTSA spokesperson said: ‘NHTSA currently has an open and active investigation of Tesla’s autopilot active driver assistance system, including the “full self-driving” software.
‘As that engineering analysis is ongoing, NHTSA considers all relevant data and information that may assist the agency in its investigation.’
In February this year, Tesla recalled nearly 54,000 cars and SUVs because their full self-driving software was found to let them roll through stop signs without coming to a complete halt.
The over-the-air software update, which was being tested by a number of drivers, allowed the vehicles to go through junctions with a stop sign at up to 5.6 miles per hour.
Documents shared by US safety regulators said Tesla had agreed to the recall after two meetings with officials from NHTSA.
Tesla said at the time that it was not aware of any crashes or injuries caused by the feature, and that there had been no warranty claims as a result of issues with the rolling start update.
The recall covered Model S sedans and X SUVs from 2016 to 2022, as well as 2017 to 2022 Model 3 sedans and 2020 to 2022 Model Y SUVs.
Beta testing of the full self-driving software was carried out by selected Tesla drivers.
However, the cars cannot drive themselves and drivers must be ready to take action at all times, the company says.
Safety advocates complain that Tesla should not be allowed to test the vehicles on public roads with untrained drivers, and that the Tesla software can malfunction, exposing other motorists and pedestrians to danger.
Most car companies with similar software test with trained human safety drivers.
In November last year, the NHTSA said it was looking into a complaint from a Tesla driver that the full self-driving software caused a crash.
The driver said their Model Y went into the wrong lane and was hit by another vehicle. The SUV gave the driver an alert halfway through the turn and the driver tried to turn the wheel to avoid other traffic, according to the complaint.
But the car took control and ‘forced itself into the incorrect lane,’ the driver reported.
No one was hurt in the crash on November 3 in Brea, California, according to the complaint.
In December, Tesla agreed to update its less sophisticated Autopilot driver-assist system after the NHTSA opened an investigation.
The company agreed to stop allowing video games to be played on centre touch screens while its vehicles are moving.
The agency is also investigating why Teslas on Autopilot have repeatedly crashed into emergency vehicles parked on roads.