Tesla Drivers Are Teaching Tesla Autopilot — That’s A Good Thing

Tesla owners are helping Tesla develop its Autopilot and, eventually, its Full Self-Driving software. Despite being decried as a bad thing by one automotive writer, this is actually a good thing.

The author of an article published by Forbes calls Tesla and those who use Autopilot "reckless," accusing the company of pushing out a "safety critical feature that is not fully developed, tested and validated." He claims that Tesla is "guilty of negligence." While the author is clearly upset about the potential safety hazard, let us remember that Tesla's own safety data indicate people driving with Autopilot engaged are much less likely to get into an accident than those driving without it, whether in a Tesla or another vehicle.

The author's specific concern is Tesla's latest update, which enables Autopilot to recognize stoplights and stop signs and stop for them automatically. He fears that this beta, unproven technology could put lives at risk due to human nature: people get too comfortable with Autopilot and end up using it irresponsibly.

CleanTechnica Director & CEO Zach Shahan, who also got the new update, added, “When cruise control came out, people were afraid it wasn’t safe. Some people today still misuse cruise control and cause accidents, but the net benefits outweigh those risks/costs. Tesla’s new Autopilot features, like the ones they are built on top of, offer greater safety if used correctly. Yes, they require the driver’s attention. Good — we should pay attention to what we’re doing. At the same time, they make sure you don’t zone out and run a red light or stop sign. That’s a safety boost. At the moment, Tesla is taking extra caution by having the car assume every light might be a red light and requiring the driver to interact at every intersection. It increases driver engagement at the moment (the opposite of what these same critics say is dangerous about previous-gen Autopilot), and will learn from the feedback it receives in order to improve the system and get considerably better than humans at responding to lights in the appropriate manner.”
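Zach's point about learning from driver feedback can be made concrete. Below is a minimal sketch, in Python, of how driver interactions at intersections could be turned into labeled training data. To be clear, this is an illustration of the general idea, not Tesla's actual pipeline; the `IntersectionEvent` type, the `label_events` function, and all field names are hypothetical.

```python
# Hypothetical sketch (not Tesla's actual code): turning driver confirmations
# at intersections into weak labels for retraining a traffic-light classifier.
from dataclasses import dataclass

@dataclass
class IntersectionEvent:
    camera_frame_id: str    # reference to the image the vision stack saw
    predicted_state: str    # what the model believed: "red", "green", "unknown"
    driver_confirmed: bool  # driver signaled it was safe to proceed

def label_events(events):
    """Derive weak labels from driver behavior.

    If the driver confirmed it was safe to proceed, treat the light as
    effectively green; if the car stopped and the driver did not override,
    treat it as red. A real system would filter noise far more carefully.
    """
    labeled = []
    for e in events:
        label = "green" if e.driver_confirmed else "red"
        labeled.append((e.camera_frame_id, label))
    return labeled

if __name__ == "__main__":
    events = [
        IntersectionEvent("frame_001", "unknown", driver_confirmed=True),
        IntersectionEvent("frame_002", "red", driver_confirmed=False),
    ]
    print(label_events(events))  # [('frame_001', 'green'), ('frame_002', 'red')]
```

This is why requiring the driver to interact at every intersection is not just a safety margin: each confirmation or override is, in effect, a free label the system can learn from.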

In order for the AI to learn, it has to be taught, and it cannot learn in an environment that doesn't provide ample learning opportunities. For example, if you were told fire hurts but had never experienced pain, the warning wouldn't mean much; you'd be curious what "hurt" means and would later regret finding out. Sofiaan Fraval, host/editor/founder of the Third Row Tesla Podcast, received the latest Tesla software update for his wife's Tesla Model S P100D. He told me, "Excitedly, I immediately took the car out for a drive to test without telling her LOL." He wasn't disappointed by the new stoplight/stop sign features. "In fact, this new software feature marks a major milestone in Tesla's history!"

Sofiaan explains that “The software has been in Early Access for a short while, and we’ve known for a while that Elon has been personally testing this on his vehicles since the shareholder meeting last year (and possibly earlier since he had that woman in the car he interviewed for a network).”

Impressed by the new update, Sofiaan shared video of the first time the car stopped at a light with no car in front of him, along with his thoughts. "It was truly remarkable just how smooth the vehicle stopped for the red light on its own, almost like I was driving myself, but I wasn't even pressing the pedals or putting any pressure on the steering wheel. It just worked right out of the box. Incredible!"

His first Tesla was a 2013 Model S P85, one of the first couple thousand ever made. It didn't have Autopilot or self-parking features, but Sofiaan loved that car. "I loved that car so much, just for the drivability of it and the precision of speed control. No other car I had ever driven compared to the premium driving experience I had in the Tesla. It was a dream come true!"

His next Tesla had the first version of Autopilot hardware, and it worked really well. "It's amazing AP1 worked so well, with just one camera. People don't realize how sophisticated the system is now and how far Tesla has come in a very short time. This is due to their fleet network, processing real-world driving data from actual customers on the road. Training the AI to behave accurately especially in weird situations that actually happen. Nobody else is so far ahead as Tesla when it comes to this."
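Sofiaan's description of fleet learning suggests a simple triage idea: the most valuable clips from a fleet are the "weird situations" where the model's prediction disagreed with what the human actually did. Here is a hedged sketch of that idea in Python; the `select_for_training` function and the event fields are illustrative assumptions, not Tesla's API.

```python
# Illustrative sketch, assuming a fleet that uploads only "interesting" clips:
# cases where the model's planned action disagreed with the driver's action.
# All names are hypothetical, not Tesla's actual system.
def select_for_training(fleet_events, max_uploads=1000):
    """Keep the rare disagreement cases; these teach the model the most."""
    disagreements = [
        e for e in fleet_events
        if e["model_action"] != e["driver_action"]
    ]
    # Most instructive first: cases where the model was confidently wrong.
    disagreements.sort(key=lambda e: -e["model_confidence"])
    return disagreements[:max_uploads]

if __name__ == "__main__":
    fleet_events = [
        {"model_action": "stop", "driver_action": "stop", "model_confidence": 0.99},
        {"model_action": "go", "driver_action": "stop", "model_confidence": 0.91},
    ]
    # Only the second event is kept: the model said "go" but the driver stopped.
    print(select_for_training(fleet_events))
```

Filtering this way is what makes a large fleet so valuable: millions of ordinary miles contribute little, but the handful of disagreements per day are exactly the hard cases a lone test fleet would rarely encounter.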

True, people abuse the Autopilot system, but that's a given with any technology: people will misuse any tech. Tesla isn't the only automaker whose customers have abused its systems with nag hacks or other workarounds to create an "Autopilot." Volkswagen owners, for example, have made their cars "self-driving" by tricking the active lane assist with a bottle of water. Zach Shahan adds, "There's only so much you can say that better tech needs disclosures and warnings. If it's better and can improve safety, it should then be able to go forward, even though we know some idiots in the world will push it beyond its limits."

The truth is that when a new technology is continuously improving, many people will be fearful. Those same fearful critics, however, rarely consider the many accidents that happen every day without it. Perhaps it's a fear of losing control: many drivers don't want to let go and let the car drive them. That fear is normal, and it may be the real reason people fear the tech.

Photos © Zach Shahan/CleanTechnica
