Tesla driver pays dearly for epic fail
TESLA'S controversial "autopilot" function is back in the news after a driver ran into the back of a police car in the US state of Connecticut.
A man driving a Tesla Model 3 hit the police car and a second, broken-down car on Interstate 95 northbound in Norwalk.
Connecticut police said the officers, who were waiting for a tow truck to arrive, had their emergency lights activated and "an additional flare pattern" behind the car to alert drivers.
Police said the man driving the Tesla told them he had "autopilot" engaged and was checking on his dog in the back seat when the collision occurred.
No one was seriously injured but the driver was given a ticket for reckless driving and reckless endangerment.
On the force's Facebook page, police said the incident "could have been more severe" and reminded drivers that "regardless of your vehicle's capabilities, when operating a vehicle your full attention is required at all times to ensure safe driving".
Tesla has long been criticised for using the term "autopilot" to describe driver assistance software that allows the vehicle to accelerate, brake and change lanes automatically.
Safety experts claim the term gives drivers a false sense of security and encourages them to test its limits.
Many drivers have posted videos of "hands-off" driving, and at least three fatal accidents have occurred while Autopilot was engaged.
Tesla itself makes it clear that the driver needs to remain in control behind the wheel at all times.
"Autopilot is an advanced driver assistance system that is intended for use only with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time," a spokesperson for the maker says.
Nevertheless, Tesla founder Elon Musk has been criticised for being too bullish about autonomous driving tech.
Earlier this year he claimed cars would be so good at driving themselves that drivers would be able to fall asleep at the wheel.
Other brands argue that we are at least a decade away from truly autonomous vehicles and that tech shouldn't be released until it's foolproof.
In the US, Tesla has been involved in lawsuits from families of people who have died while using Autopilot.
The US National Transportation Safety Board has also voiced concerns, suggesting that Autopilot could be lulling some drivers into a false sense of security.