The third major Tesla Autopilot crash is already upon us, after a Tesla Model S in Autopilot mode rear-ended a parked police car. At this point, one of two things must be true: either Tesla’s Autopilot AI has developed suicidal tendencies, or people really, really trust Tesla’s Autopilot mode. There is good reason for the surprise. In 2018, knowing full well that a Tesla in Autopilot mode has a tendency to crash, I’d have avoided that Autopilot button. Luckily, the parked police car was empty, so no one was hurt in the process, not counting the Laguna Beach police car itself, which was totaled in the impact. “Thankfully there was not an officer at the time in the police car,” Laguna Beach Police Sgt. Jim Cota told Forbes. “The police car is totalled.”
This follows a previous incident involving a Tesla Model S in the state of Utah, and according to several reports, this particular Model S isn’t even the first major Tesla crash in Laguna Beach. The same police officer quoted above mentioned that a Tesla had run into a semi-truck last year in about the same spot, which raises the question of why this keeps happening.
Following the incident, a Tesla spokesperson told the LA Times via email that “Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and before a driver can use Autopilot, they must accept a dialogue box which states that ‘Autopilot is designed for use on highways that have a center divider and clear lane markings.’”
Furthermore, Tesla’s Model S owner’s manual warns drivers that some Autopilot functions will not be able to “detect all objects and may not brake/decelerate for stationary vehicles or objects especially when traveling over 50 mph,” as well as when a car ahead “moves out of your driving path and a stationary vehicle or object is in front of you.”
All of this raises the question of why, despite these incidents, Tesla is still standing by its Autopilot technology, even if the accidents are primarily caused by the human operators. The number of failures is far too high, and the responsible thing for Tesla to do would be to disable the Autopilot feature until a better solution is found.
Image Source: California Police