A 40-year-old man from Ohio was killed when his Tesla Model S collided with a large truck while in Autopilot mode. Joshua Brown is the first known fatality involving Tesla’s self-driving technology, and the US National Highway Traffic Safety Administration is currently investigating the circumstances of the crash.
A blog post on Tesla’s website says that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across a junction in front of it.
“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”
Autopilot is a feature of Tesla’s Model S and Model X SUV and comprises adaptive cruise control and lane departure assistance. Tesla call it a public beta.
The cruise control system uses radar and forward-facing cameras to track the car in front and adjusts speed accordingly. The lane departure assistance system uses cameras to track road markings to keep the vehicle in its lane. Another feature is Autosteer, where the driver indicates manually and the car itself changes lane, using its sensors to avoid hitting anything.
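The core idea behind the adaptive cruise control described above can be sketched in a few lines. This is purely an illustrative sketch, not Tesla's implementation: the function name, the 40-metre safe gap, and the decision logic are all assumptions made for the example.

```python
# Hypothetical sketch of adaptive cruise control logic (NOT Tesla's code):
# pick a target speed so the car holds the driver's set speed unless the
# radar/camera report a vehicle ahead that is closer than a safe gap.
from typing import Optional

def acc_target_speed(set_speed: float,
                     gap_m: Optional[float],
                     lead_speed: float,
                     safe_gap_m: float = 40.0) -> float:
    """Return the speed the car should aim for on this control tick."""
    if gap_m is None:
        return set_speed              # nothing detected ahead: cruise
    if gap_m < safe_gap_m:
        # Too close: slow to the lead car's speed (never exceed set speed)
        return min(set_speed, lead_speed)
    return set_speed                  # gap is comfortable: hold set speed

# Example: cruising at 70 mph, car ahead doing 55 mph only 30 m away
print(acc_target_speed(70.0, 30.0, 55.0))  # 55.0
```

A real system would of course smooth these transitions with a proper controller rather than jumping between speeds, but the gap-keeping decision is the essence of the feature.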
Tesla say this is the first known fatality in just over 130 million miles where Autopilot has been activated and they contrast this with the rate for all vehicles in the US – a fatality every 94 million miles.
When Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” If hands are not detected on the wheel, it sounds warnings and gradually slows the car until they are detected again.
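The escalation described above — warn first, then gradually slow the car until hands are detected again — can be pictured as a simple state machine. The sketch below is hypothetical: the timing thresholds, action names, and deceleration rate are invented for illustration and are not Tesla's actual values.

```python
# Illustrative hands-on-wheel monitor (hypothetical thresholds, NOT Tesla's
# real logic): escalate from a visual reminder to an audible warning, then
# gradually bleed off speed until the driver's hands are detected again.

def monitor_step(hands_detected: bool,
                 seconds_without_hands: float,
                 speed_mph: float) -> tuple:
    """Return (action, new_speed) for one monitoring tick."""
    if hands_detected:
        return ("resume", speed_mph)           # driver back in control
    if seconds_without_hands < 5.0:
        return ("visual_warning", speed_mph)   # dashboard reminder
    if seconds_without_hands < 15.0:
        return ("audible_warning", speed_mph)  # chime escalation
    # Still no hands: slow the car gently, e.g. 2 mph per tick
    return ("slow_down", max(0.0, speed_mph - 2.0))

# Example: hands off the wheel for 20 s at 60 mph -> car begins slowing
print(monitor_step(False, 20.0, 60.0))  # ('slow_down', 58.0)
```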
Despite this, the Guardian reports that there is a trend for Tesla owners to post self-driving videos, including one particularly striking example (shot on a private road) in which the driver is in the back seat.
7 comments
So Tesla's autonomous mode is now only a little better than the general population including drunk and other intoxicated drivers, sleepy drivers, distracted drivers etc. And since it is these drunks etc that die in crashes, then the Tesla car by extension is as bad as the drivers that drive so badly as to get themselves killed.
Still, it's early days and one crash does not make a statistic, and the system will improve. But it's got a long way to go; so far it can only handle motorways that are generally free of pedestrians, cyclists, animals and perpendicularly moving white trucks.
The Tesla doesn't use the camera in the rear-view mirror housing to detect objects; it only uses the radar in the front grille. This means anything above 4ft in height might not be detected, like a lorry trailer.
If they used the camera to detect things, it might not have happened, but then again the camera might struggle to see a bright white lorry if it is sunny.
I think that it is clever technology, and a step in the right direction, but it needs refinement and improving before people can start taking their hands off the wheel.
And there's the problem: self-driving cars, even first-generation ones like this Tesla, are quietly marketed, and understood by the general public, as if you're David Hasselhoff and your car is KITT. Which is a long way from the reality, IMO.
https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter
It looks like it may well be driver error: while clearly the system should have spotted the lorry, the driver, if it's true, shouldn't have been watching a film.
Tesla expects drivers to be ready to intervene and take over from the computer if they have to. The Grauniad said in their comments that it would be safer to have computers running backup to the humans rather than expecting humans to be the backups to the computer- given that many drivers don't pay proper attention in *manual* cars...
There were times that I could barely stay awake driving on cruise control on the motorway because there was so little to do apart from pointing the car in the right direction. Tesla's official position is that drivers should keep their hands on the wheel and foot on the brake while in Autopilot, but it just seems inevitable that if they automate even the braking and steering, people are not going to be ABLE to pay attention for any long stretch of time. If you don't believe me, try doing nothing for two hours except staring out of the window and counting every green car that drives past. Your mind WILL wander after a while. That said, their instruction to drivers probably has a lot to do with reducing the company's liabilities in cases like this.
I still think they'll be safer for cyclists than drivers though. Unless they programme in road rage.