Consumer Reports recently "found" that a Tesla can be made to drive with Autopilot engaged and no one in the driver's seat. I have some thoughts on these findings. Quick disclaimer: I am a Tesla shareholder — I own 4 shares — and I also believe that Tesla is frequently misrepresented in the mainstream media. Here are my thoughts.
Did you know that if you slip some kind of defeat device into a seatbelt buckle, the car will think the belt is latched even when no one is wearing it? The same can be said for Tesla's Autopilot.
This doesn't prove anything new except just how far Consumer Reports and other outlets will go to spread FUD about Tesla. If you're not familiar with that term, FUD stands for fear, uncertainty, and doubt. It isn't outright lying, but rather presenting a topic in a way that dissuades people from supporting or agreeing with it.
How Consumer Reports Got The Tesla To Operate Without A Driver
During the test, the vehicle operator drove the car until it reached 15 miles per hour. Next, he engaged Tesla's Autopilot feature and then dialed the set speed down to zero, which brought the car to a stop. Keep in mind, Autopilot remained engaged even though the vehicle was at a complete stop.
Next, he attached a small weight to the steering wheel, a common "nag hack" used by people abusing Autopilot and other driver-assist programs (yes, it happens with other vehicles, too). The weight was just heavy enough to make the car think there was a hand on the wheel. He then moved from the driver's seat to the passenger side without unbuckling the seatbelt. Autopilot remained engaged throughout his move.
Next, he pressed the accelerator and raised the set speed using the scroll wheel on the steering wheel. Of course, the vehicle thinks there's a driver in the seat in this scenario: the seatbelt was never unbuckled and there was a weight on the wheel. The person testing the vehicle said there were no warnings that no one was sitting in the driver's seat. Clearly, the vehicle was tricked into thinking a driver was present.
What Does This Actually Show?
I have both positive and negative thoughts about this. The positive is that Tesla will, as it always does, innovate to solve this problem. Tesla is great about meeting challenges, and this is something its engineers will aim to fix. However, I really think that Consumer Reports cheated to make Tesla's feature seem unsafe, especially since Tesla isn't the only automaker whose driver-assist system can be defeated in this manner.
Also, by making a very public how-to video about this, Consumer Reports effectively taught people how to trick their cars into thinking there is a driver in the seat. This is very dangerous. Tesla states on its website, right where you purchase FSD, that drivers need to remain attentive at all times. Here's a screenshot for proof.
My Challenge To Consumer Reports
Redo the test. Do it on the same track, but this time, unbuckle the seatbelt and don't add weights to the steering wheel. And in another video, demonstrate what Tesla vehicles do when Autopilot detects that there is no driver, or that a driver is ignoring its attention prompts.
According to Consumer Reports, it "works side by side with consumers for truth, transparency, and fairness in the marketplace." If that statement were true, Consumer Reports would publish a piece on Tesla's Autopilot nag hacks and how the system responds when it detects that there isn't a driver in the seat.
Also, if you're going to trick driver-assist software, you should do it for every automaker that offers such a system instead of singling out Tesla.