The program itself is in beta testing and was released this past October. Until this week there had been no fatal accidents. A tractor trailer made a left turn across a divided highway, and for some reason the Tesla did not detect the obstacle: the car drove under the trailer, which effectively sliced off the top of the car. It is a truly grisly accident, and there are many questions about how it happened.
According to Tesla, the car did not see the white trailer against the bright white sky. But what was the driver doing? It is still unclear. Unfortunately, we may never know whether it was driver error (was he truly paying attention, or were his hands off the wheel while he texted?), the car's error (was the driver expecting the car to stop? If so, why didn't he intervene when it clearly was not slowing down?), or even the truck driver's error (did the truck turn into oncoming traffic too quickly, without seeing the oncoming Tesla?).
As these answers emerge, there has been much discussion about how the crash will affect the current push toward driverless cars. Several companies are developing different versions of the technology; the two best known are probably Google and Tesla.
The makers of these new technologies assure us that driverless cars and automated driving systems are safer than their human counterparts. And so far the numbers seem to add up: the current national average is roughly one fatality per 90 million miles driven, while Tesla's Autopilot has so far logged about one per 130 million miles. However, because there has been only one Autopilot fatality, the sample size is far too small to draw conclusions. Others point out that the national average includes things like motorcycles, old cars, unbelted occupants, and side streets (where most accidents happen anyway). If that is the case, the Tesla number may actually be WORSE than the average for modern cars with seatbelts on highways and freeways (where the Autopilot feature is designed to run).
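To see why a single fatality makes the comparison meaningless, here is a rough sketch (not from the article, and using the approximate 130-million-mile figure) that computes an exact Poisson 95% confidence interval for a count of one event:

```python
import math

# Assumption: roughly 130 million miles driven on Autopilot, with
# exactly one observed fatality. With k = 1, the exact Poisson 95%
# confidence interval on the expected count is very wide.

MILES = 130e6  # approximate Autopilot miles at the time

# Lower bound: solve P(X >= 1 | lam) = 0.025  ->  1 - exp(-lam) = 0.025
lam_lo = -math.log(1 - 0.025)

# Upper bound: solve P(X <= 1 | lam) = 0.025  ->  exp(-lam) * (1 + lam) = 0.025
def upper_tail(lam):
    return math.exp(-lam) * (1 + lam)

lo, hi = 0.0, 50.0
for _ in range(100):          # simple bisection on the monotone tail function
    mid = (lo + hi) / 2
    if upper_tail(mid) > 0.025:
        lo = mid
    else:
        hi = mid
lam_hi = (lo + hi) / 2

# Convert the count interval into "one fatality per N miles"
print(f"95% CI: one fatality per {MILES / lam_hi:,.0f} "
      f"to {MILES / lam_lo:,.0f} miles")
```

The interval runs from roughly one fatality per 23 million miles to one per 5 billion miles, which comfortably contains the national rate on either side. In other words, one data point cannot tell us whether Autopilot is safer or more dangerous than a human driver.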
However, driverless car proponents say this accident may have been a fluke or the fault of the driver, and they maintain that autopilot features are safer than human drivers.
With the Tesla software still in beta, however, we urge drivers to use extra caution when relying on these features.
I, for one, probably would not trust it. How do you feel about this new technology? Take some time to comment below.