A Tesla Model S rear-ended a parked fire truck on a freeway near Los Angeles on Monday, the US National Transportation Safety Board (NTSB) has confirmed. The safety board will send two investigators to conduct a field probe of the accident near Culver City, California, assessing both the driver’s actions and the vehicle’s behaviour. This is the second time a Tesla has been involved in an accident with its Autopilot system engaged. The system combines advanced cruise control and automatic steering, enabling hands-free driving in limited situations. However, Tesla has maintained that Autopilot is only to be used with the driver paying full attention; the system warns the driver, and eventually disengages, if it does not sense hands on the steering wheel.
The driver of the Tesla said he had activated the vehicle’s Autopilot system before it struck the fire truck at about 104 km/h, a union for Culver City firefighters said in a tweet on Monday. “Amazingly there were no injuries! Please stay alert while driving!” the union said. According to Mercury News, the fire truck was parked in the emergency lane at the side of the highway while its crew attended to another accident.
Tesla said in a statement that its autopilot feature is “intended for use only with a fully attentive driver.” The company said it has taken steps to educate drivers about the need to keep their hands on the steering wheel and be prepared to take over from autopilot, which it calls an “advanced driver assistance system” that is not intended to turn the vehicle into an autonomous car.
The US safety board had previously found that Tesla’s advanced driver-assistance system was a contributing factor in a fatal 2016 crash, in which a Model S drove underneath a semi-truck crossing the road after the Autopilot sensors failed to detect it.
The National Highway Traffic Safety Administration (NHTSA) ruled that the fault lay partly with the driver and partly with the car in the case that made headlines as Tesla’s first fatal Autopilot crash. Ohio resident Joshua Brown was killed in the highway crash in Florida. The NHTSA said that Brown had ignored several warnings from the car to take over, and also that the car’s design encouraged a “period of extended distraction”.
So there are two things to note: our self-driving tech isn’t refined enough for the real world, and we as humans aren’t ready to rely on it yet. As mentioned earlier in this piece, Tesla’s Autopilot system requires you to keep your hands on the steering wheel at all times.
The video above shows how a man tricked the system into believing the driver was holding the steering wheel, simply by wedging an orange into the wheel (the system interpreted the orange’s weight as a hand on the wheel). Although the man in the video maintained that he ran his experiment on an empty road, how long before numpty ideas like these spread and we have people sitting behind the wheel watching Netflix or eating a bowl of noodles, jeopardising themselves and everyone else on the road?