A lot more details have surfaced regarding the recent Tesla Autopilot death in Florida. Unfortunately, they aren’t very positive for Tesla, nor for how the driver was apparently using Autopilot.
Aside from what Steve reports below, here’s a note passed along by someone close to the deceased driver’s former coworkers:
I am very close to a few of his former EOD coworkers and they are all very, very sad.
This was sent to me on Monday by my good friend:
“A buddy of mine (Josh Brown — prior Navy EOD) had the same car with the auto pilot and loved it. He drove all over the country in his tending to his business and had all the recharge stations mapped out everywhere. It was cool and he posted a bunch of YouTube videos with one being picked up by Elon Musk and posted on the Tesla site. When that happened, Josh posted on Facebook: ‘I can die and go to heaven now…’ Unfortunately that statement was prophetic in that Josh died a couple of weeks later in a car accident. We’re pretty certain, he had it on auto-pilot while working on his laptop and didn’t see the semi that pulled out. He shot underneath it and it clipped his roof killing him instantly. Sorry for the downer but, he loved that car and swore by it. To say I was relieved to see you ‘didn’t’ get the auto pilot is an understatement.. As soon as I saw you had ordered a Tesla, my blood literally ran cold until I saw ‘no auto-pilot’. It is a great car and I feel like an old man in saying ‘please be careful in it young lady!!!’ Take care”
Be safe out there — whether you’re using Autopilot or not.
Sad news, and it’s not a good sign that the driver apparently wasn’t watching the road while Autopilot was engaged. Early word was that he was working on his laptop; the reports below indicate he was instead watching Harry Potter. Either way, it sounds like he probably wasn’t paying close attention to the road.
Furthermore, if the reports are correct (we don’t know for certain yet), he was apparently speeding.
All of these points raise the concern that people (even former Navy SEALs) get complacent with Tesla Autopilot on. This is something that struck me while testing out Autopilot for the first time. At first, I was very hesitant to let go and stayed very attentive, but after a few minutes, even in thick, fast-moving California traffic, I relaxed a lot, and while in conversation with others in the car, my mindset shifted more to that of a passenger. It then hit me that I wasn’t paying close attention to the road. If this happened to me after just a few minutes of driving on Autopilot, I imagine it happens even more to people who own a car with Tesla Autopilot and start to trust it more than they should.
And then, there’s the really concerning bit about the technology itself. Apparently, even after having its roof torn off, the Tesla Model S kept driving, through fences and such, until crashing into a pole. Not the kind of thing that’s going to sell more Autopilot Teslas, get regulators to relax, or boost Tesla’s stock.
Naturally, as Tesla’s disclaimers say, Autopilot is in beta and drivers are supposed to remain alert and ready to take over at any time. Nonetheless, this is a rather chilling story.
Below is more information on the story from Steve Hanley at Gas2.
New Details About Fatal Tesla Crash Emerge
UPDATE, July 2, 9:00 am: According to Automotive News, “There was a portable DVD player in the vehicle,” said Sergeant Kim Montes of the Florida Highway Patrol (FHP) in a telephone interview.
The story about the man who died in a crash while driving his Tesla Model S in Autopilot mode on a highway in Florida is all over the news. When the story broke a few days ago, I first assumed it happened this week. I was surprised to learn later that it actually happened on May 7. I do this stuff for a living (sort of) and so I monitor news about Tesla and Elon Musk fairly closely. Until Wednesday, there was not a single report anywhere on the internet about a fatal crash involving a Tesla back in May. Then Tesla apparently finally found out about the crash and NHTSA got involved.
At first, details were sketchy. We heard that a tractor trailer was making a left turn at an intersection on a four-lane highway with a wide median strip. There were no traffic lights at the intersection. These sorts of road junctions are quite common in Florida and other parts of the country. Whether the tractor trailer was turning left or executing a U turn is still unclear to me.
In any event, my first reaction was that a large vehicle like that should not have pulled out in front of oncoming traffic. Perhaps the truck driver was partially at fault? If you are building an autonomous driving system, how do you program it to anticipate every stupid thing a human being is capable of, whether it is a truck turning into the path of the car or a clueless pedestrian stepping off a curb into traffic while engrossed in the morning newspaper and drinking a flat white latté?
Now, details are beginning to emerge and they are disturbing. First, the driver of the Tesla was Joshua D. Brown, of Canton, Ohio. He was 40 years old and a Navy SEAL for 11 years. He left the Navy in 2008, according to the Pentagon. He was a very tech-savvy fellow. He was the founder of his own internet network and camera company, according to a report in the Dallas Morning News. He posted videos about his Tesla experiences on several occasions, including one showing how the Autopilot system in his car once saved him from a potentially dangerous collision when a white commercial truck swerved suddenly into his lane.
A woman driving on the same highway and in the same direction as Brown claims she was doing 85 mph when Brown’s Model S flew past her. That information is not included in the official traffic accident report filed by the Florida Highway Patrol, but it is surely something known to Tesla, which has access to all of the data stored in the car’s computer system. It is included in a story at Teslarati.
The driver of the tractor trailer, Frank Baressi, age 62, told the Dallas Morning News in a telephone interview that the Tesla driver was “playing Harry Potter on the TV screen.” He added, “It was still playing when he died.” Baressi said, “he went so fast through my trailer I didn’t see him.” He didn’t see the video playing, but claims he could hear the movie still playing when the Tesla finally came to a stop several hundred yards up the road. Tesla Motors says it is not possible to watch videos on the Model S touchscreen, but word is that Brown had a portable DVD player in the car.
The really scary part is that not only did the sensors in Brown’s car fail to detect the tractor trailer directly in front of it, the car itself continued to drive down the highway for several hundred yards after its roof was sheared off. It finally came to a stop in the yard of a home owned by Bobby Vankavelaar. He told ABC Action News that the Tesla traveled “hundreds of yards from the point of impact, through a fence into an open field, through another fence and then avoided a bank of trees before being unable to swerve and miss a power pole that eventually stopped the car a few feet away.”
So, is this a ‘perfect storm’ event? Someone driving much too fast while distracted, coupled with a truck driver who looked but didn’t see a car approaching in the opposite direction, in unfortunate lighting for a white truck? And what impact will this have on the advent of autonomous driving technology? Mike Harley, an analyst at Kelley Blue Book, says systems like Tesla’s, which rely heavily on cameras, “aren’t sophisticated enough to overcome blindness from bright or low contrast light.” He says to expect more deaths as the autonomous technology is refined.
Karl Brauer, a senior analyst with Kelley Blue Book, said the crash is a huge blow to Tesla’s reputation. “They have been touting their safety and they have been touting their advanced technology,” he said. “This situation flies in the face of both.”
Really, Karl? We said in a previous post that fatalities will continue to occur even when more cars start driving themselves. The difference is that, statistically, the likelihood of a fatal accident will be lower for autonomous cars than for cars operated by human drivers. This unfortunate incident occurred after Teslas worldwide had accumulated more than 130 million miles in autonomous mode without a fatality. Statisticians put the average at roughly one traffic death every 100 million miles. By that crude measure, Autopilot has already gone about 30% farther between fatalities than the average human driver.
Elon Musk continues to remind people that Autopilot is still only an aid. Drivers must remain alert, aware, and ready to take control at any time. My wife says autonomous cars are like boats towing water skiers. Most states require there be two people in the boat — one to steer and one to watch the skiers. She thinks if you are going to use computers to drive your car while you nap or watch videos, another responsible adult should be required to watch the road ahead.
Her idea is not as fanciful as it may sound. At the beginning of the automotive age, drivers entering a city were required to have a person on foot walk in front of the car sounding a klaxon to warn the citizenry that a motorized vehicle was approaching. Will regulators require something similar now that we know our machines can fail to protect us from all danger?
My guess is that the impetus of technology cannot be denied. Absent deliberate malfeasance on the part of a manufacturer, the world will accept that there is always an element of risk. The technology will get better. Within a few years, when, according to IHS Automotive, 20 million autonomous cars will be sold every year worldwide, the chances of a fatal accident occurring in a self-driving car will probably be closer to once every 500 million miles.
Tesla is very lucky this was a single car accident. If another driver had died as a result of being impaled by a speeding Tesla with its roof sheared off, the odds are an army of trial lawyers would descend on the victim’s family, begging for a chance to be the first to sue Tesla. In the end, the fate of autonomous driving technology may not be determined as much by regulators as by the courts.
Screenshot via Teslarati