The Death of Joshua Brown
In October 2014, Tesla began selling sedans with a $4,250 technology package containing a dozen ultrasonic sensors, a camera, a front radar, and digitally controlled brakes. The package allowed the car to brake automatically to avoid a collision. A year later, Tesla released a software update named Tesla Version 7.0 to the 60,000 cars it had sold with the technology package. The new software enabled the car to control its speed and steer. Tesla gave the software update the nickname Autopilot [1]. Here is what Tesla wrote on its Web page: “While truly driverless cars are still a few years away, Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car” [2]. That made Tesla Motors the first automaker to release a product exhibiting Level 3 automation as defined by SAE International: the car drives itself under limited conditions, but the human driver must be prepared to take back control whenever the system requests it [3].
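Because the SAE levels frame everything that follows, here is a minimal sketch of the J3016 taxonomy as a Python enum. The one-line glosses paraphrase the standard; the class and helper function are my own illustration, not part of any real automotive API.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (paraphrased)."""
    NO_AUTOMATION = 0           # human does all of the driving
    DRIVER_ASSISTANCE = 1       # system steers OR controls speed
    PARTIAL_AUTOMATION = 2      # system steers AND controls speed; human supervises
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; human must take over on request
    HIGH_AUTOMATION = 4         # system drives in limited conditions; no takeover required
    FULL_AUTOMATION = 5         # system drives everywhere, in all conditions

def driver_must_supervise(level: SAELevel) -> bool:
    """Below Level 3 the human must watch the road at all times; at Level 3
    the human may look away but must respond to a takeover request."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```

The boundary that matters in this case is the one the helper tests: Level 3 is the first level at which the driver is permitted to stop supervising, which is exactly what makes the hand-off discussed below so difficult.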
Joshua Brown was a Tesla fanatic. He nicknamed his Model S sedan Tessy, and he averaged more than 5,000 miles per month on the road [4]. Mr. Brown posted YouTube videos showing himself “driving” hands-free and testing the limits of the system [5, 6, 7]. On May 7, 2016, Mr. Brown was killed when the Tesla Model S he was “driving” crashed into a tractor trailer on a Florida highway [8]. Tesla’s first public response to the accident came nearly two months later, on June 30 [9]. I encourage you to read it in full.

Question 1: How much moral responsibility does Tesla Motors carry for the death of Joshua Brown?

Details of the Accident

The accident occurred as Joshua Brown’s Model S was traveling east on US-27A, a divided highway in northern Florida. A tractor trailer, traveling in the opposite direction on the highway, turned left in front of the Tesla, which was in Autopilot mode. According to Tesla Motors, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied” [9]. The trailer was high enough off the ground that the car continued under it, shearing off the car’s roof. The car then left the road and struck two fences and a power pole before coming to a stop [8].

The accident that killed Joshua Brown occurred at an at-grade intersection of a divided highway. Divided highways are more dangerous than freeways: on a per-mile basis, the probability of getting into an accident is higher. In fact, the Interstate System of freeways is the safest system of roads in the country [10]. According to the National Transportation Safety Board, Joshua Brown’s Tesla Model S was traveling 74 miles per hour with Autopilot engaged at the time of the crash with the tractor trailer, 9 miles per hour above the posted speed limit of 65 miles per hour [11]. According to the web site Quartz, Autopilot could remain engaged at speeds up to 89 miles per hour [12].

Question 2: Should Tesla Motors have added restrictions to the beta version of Autopilot so that it could be activated only while driving on freeways?

Question 3: Should Autopilot allow the driver to set a cruising speed above the speed limit, and if so, by how much?

The Hand-off Problem

In 2015, before the Tesla accident, Ford Motor Company announced its plans to introduce a self-driving car by 2021. It also said it was skipping Level 3 because of an inherent difficulty: how can the computer ensure the driver is paying enough attention to take back control in an emergency? Ford said that its tests indicated it took an average of 3 to 7 seconds, and sometimes as long as 10 seconds, for a driver to take control of the vehicle. This is called the hand-off problem [13]. Handing off control is even more difficult if the driver is distracted. The Model S sedan Joshua Brown was driving had no mechanism to ensure the driver kept his attention on the road while Autopilot was engaged; people have observed Tesla sedans traveling while the “driver” sleeps [14]. The Florida Highway Patrol found a portable DVD player in Joshua Brown’s Tesla Model S. Some witnesses said they heard a Harry Potter movie playing when they approached the car after the accident, although others at the scene said no movie was playing [15].
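To make Ford’s timing numbers concrete, here is a minimal sketch of a takeover-request loop that budgets for the worst case Ford observed. Everything here is hypothetical: the function names and vehicle hooks are illustrations, not Tesla’s or Ford’s actual logic.

```python
import time

# Ford's tests found takeover took 3 to 7 seconds on average, and
# sometimes as long as 10 seconds, so the budget uses the worst case.
TAKEOVER_BUDGET_S = 10.0
WARN_INTERVAL_S = 1.0

def request_handoff(driver_has_hands_on_wheel, alert_driver, begin_safe_stop):
    """Ask the driver to take over, escalating until a deadline passes.

    The three callables are hypothetical hooks into the vehicle:
      driver_has_hands_on_wheel() -> bool  # e.g., a steering-torque sensor
      alert_driver()                       # chime and dashboard warning
      begin_safe_stop()                    # slow down and pull over
    """
    deadline = time.monotonic() + TAKEOVER_BUDGET_S
    while time.monotonic() < deadline:
        if driver_has_hands_on_wheel():
            return True               # driver took over within the budget
        alert_driver()                # keep warning while the clock runs
        time.sleep(WARN_INTERVAL_S)
    begin_safe_stop()                 # no response: never just disengage
    return False
```

The point of the sketch is the deadline: a Level 3 system needs a safe fallback behavior for the case where 10 seconds elapse and the driver still has not responded, because, as the DVD player in Brown’s car suggests, the system cannot assume the driver is watching the road.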
Ford has publicly announced that it will not sell an automobile with Level 3 automation. It does plan to start selling an automobile with Level 5 automation, a fully self-driving car, in 2021, but it will not have a steering wheel, gas pedal, or brake pedal. Control will never be handed off from the computer to the driver [13].

Question 4: Should Tesla Motors have released Autopilot to the public before the hand-off problem had been solved?