Elon Musk, chief executive of electric automobile and energy storage company Tesla Motors, said that it would be "morally reprehensible" to delay the release of partially autonomous vehicles.
Musk, who also co-founded PayPal and chairs solar power company SolarCity, released his "Master Plan, Part Deux" this week, laying out his plans for the coming decade for the growth of electric and autonomous vehicles.
In the statement he discussed the spread of autonomous technology across the entire Tesla range, the safety of that technology - even at the partial level available to current Tesla owners - and how it would be outrageous not to implement it for fear of bad press or legal liability.
Musk's comments are particularly poignant given that the first road death in a Tesla operating in semi-autonomous Autopilot mode occurred in May.
It later became apparent that the driver was not abiding by Tesla's guidelines for operating a vehicle in semi-autonomous mode; nonetheless, questions were raised over the safety of such vehicles - especially for vulnerable road users.
Therefore, Musk's comments that all Teslas will soon have the capacity to be fully self-driving could be worrying for some road users.
He said: "All Tesla vehicles will have the hardware necessary to be fully self-driving with fail-operational capability, meaning that any given system in the car could break and your car will still drive itself safely."
Despite those fears, Musk appears to back autonomous technology to the hilt while defending what appears to be a system in its infancy.
"It is important to emphasize that refinement and validation of the software will take much longer than putting in place the cameras, radar, sonar and computing hardware," he wrote.
"Even once the software is highly refined and far better than the average human driver, there will still be a significant time gap, varying widely by jurisdiction, before true self-driving is approved by regulators."
He suggested that it will take somewhere in the region of 6 billion miles of testing before regulatory bodies are ready to accept the technology. Current testing is happening at just over 3 million miles per day.
Those regulatory restrictions don't limit the use of partial autonomy, though, which is how a driver came to die while using the Autopilot feature.
Even in the face of such a tragedy, Musk maintained that it would be "morally reprehensible" not to implement the technology.
"The most important reason [for deploying partial autonomy now] is that, when used correctly, it is already significantly safer than a person driving by themselves," he wrote.
"It would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability."
Musk goes on to say that, according to the National Highway Traffic Safety Administration's 2015 report, deaths involving on-road vehicles increased by 8% - to one every 89 million miles. Musk claims that his Autopilot technology will soon better that figure twofold, and that the system is getting better every day.
He goes on to say it would make as much sense to turn off Tesla Autopilot as "it would to disable autopilot in aircraft, after which our system is named."
When it comes to cyclists, Tesla has been far less vocal than other autonomous vehicle researchers.
Google's autonomous vehicle concept was designed with vulnerable road user safety in mind - with foam bumpers to soften any unlikely impacts - while the company has also patented sticky bumper technology to reduce the chance of secondary impacts.
Google has also been vocal about its software which has been programmed to pay special attention to cyclists, and to recognise commonly used hand signals.
Meanwhile Renault's chief executive took a swipe at cyclists, calling them "one of the biggest problems" for driverless cars.
Tesla's roadmap for an autonomous vehicle future may be light on details that would reassure vulnerable road users, but at least the American company isn't actively antagonising them.
The sooner, the better. The cars just have to be slightly safer than humans for it to be beneficial and as most incidents are caused by lack of attention, I reckon they're safer already.
Good link, will have a shot at that during my weekly procrastinations
BTW agree with Musk, if autonomous has a greater safety record per mile than driven, then yes, it's morally wrong to delay.
You might want to follow this link, and make your own observations on/contributions to the consultation paper. Some of it, and the discussions therein, make interesting, and enlightening, reading. https://www.gov.uk/government/news/new-measures-to-help-britain-lead-the...
He's right. I am quite happy to leave my safety in the hands of a computer if the only alternative is to trust a human. The average driver, round here at least, is barely competent and often finds difficulty in simple manoeuvres such as overtaking a cyclist.
Bring it on.