Co-authored with Scott Hoogerwerf, Microsoft Corporation
The development of new AI applications has the potential to bring enormous improvements to every sector of our society. However, as AI plays an ever-larger role, the harmful impacts of poorly implemented systems can increase as well. Corporations and educational institutions can work together to help ensure that new technologies are deployed in a responsible manner.
See the full article in the Puget Sound Business Journal.
Co-authored with Nathan Colaner.
The creator of FakeApp says it would be wrong to condemn the technology. How could we not?
The rideshare economy is characterized by a glut of workers and falling wages. We’ve seen this happen before. Should the government intervene?
“I spend probably 1-2 hours per week on my job for which I am getting a full time wage.”
The anonymous person (whom we’ll call Eve) making this post to The Workplace website explains that she was hired as a programmer to support a legacy system. Her job is to take a batch of requirements, stored as data in spreadsheets, and write SQL scripts to configure the system based on the requirements. It’s a complicated process, and the analysts creating the spreadsheets “spend a fair bit of time verifying” Eve’s work to ensure that the SQL scripts are correct “because the process is so tedious that it’s easy to make a mistake.” Although it’s boring work, it is a full-time job with a good salary, and it allows Eve to work from home and take care of her son.
It took Eve about a year to figure out all the complications and write software that can remove errors from the spreadsheet and produce the SQL scripts. She can now do in 10 minutes what took the previous employee a month to do. When Eve gets a new set of spreadsheets, she quickly produces the scripts. Every week, she tells her employer that she’s completed another part of the job and asks the analysts to verify the SQL scripts. She inserts “a few bugs here and there to make it look like it’s been generated by a human.” The company has never indicated any dissatisfaction with her job performance.
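The post does not include Eve’s actual code, but the pipeline she describes, spreadsheet rows in and SQL scripts out, might look something like the following minimal sketch. The column names, the `config` table, and the error-cleaning rule are all invented here for illustration; her real system was far more complicated.

```python
import csv
import io

def generate_sql(requirements_csv):
    """Read requirement rows (setting, value) and emit SQL UPDATE statements.

    Rows with a missing field are skipped -- a stand-in for the
    error-removal step Eve's software performs on the spreadsheets.
    """
    statements = []
    reader = csv.DictReader(io.StringIO(requirements_csv))
    for row in reader:
        setting = (row.get("setting") or "").strip()
        value = (row.get("value") or "").strip()
        if not setting or not value:
            continue  # drop malformed rows instead of emitting a bad script
        safe_value = value.replace("'", "''")  # naive single-quote escaping
        statements.append(
            f"UPDATE config SET value = '{safe_value}' WHERE name = '{setting}';"
        )
    return "\n".join(statements)

sample = """setting,value
timeout,30
,missing-name
retries,5
"""
print(generate_sql(sample))
# UPDATE config SET value = '30' WHERE name = 'timeout';
# UPDATE config SET value = '5' WHERE name = 'retries';
```

Once a script like this exists, each new batch of spreadsheets takes minutes instead of a month, which is exactly the gap between Eve’s reported effort and her reported output.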
The Death of Joshua Brown
In October 2014, Tesla began selling sedans with a $4,250 technology package containing a dozen ultrasonic sensors, a camera, a front radar, and digitally controlled brakes. The package enabled automatic emergency braking: the car could apply the brakes on its own to avoid a collision. A year later, Tesla released a software update named Tesla Version 7.0 to the 60,000 cars it had sold with the technology package. The new software enabled the car to control its speed and steer itself. Tesla gave the software update the nickname Autopilot.
Here is what Tesla wrote on its Web page: “While truly driverless cars are still a few years away, Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car.”
That made Tesla Motors the first automaker to release a product exhibiting level 3 automation as defined by SAE International: the system handles driving under some conditions, but a human driver must be ready to take back control when the system requests it.
Joshua Brown was a Tesla fanatic. He nicknamed his Model S sedan Tessy, and he averaged more than 5,000 miles per month on the road. Mr. Brown posted YouTube videos showing himself “driving” hands-free and testing the limits of the system [5, 6, 7].
On May 7, 2016, Mr. Brown was killed when the Tesla Model S he was “driving” crashed into a tractor trailer on a Florida highway. Tesla’s first public response to the accident came nearly two months later, on June 30. I encourage you to read it.
Question 1: How much moral responsibility does Tesla Motors carry for the death of Joshua Brown?
Details of the Accident
The accident occurred as Joshua Brown’s Model S was traveling east on US-27A, a divided highway in northern Florida. A tractor trailer, traveling in the opposite direction on the highway, turned left in front of the Tesla. The Tesla was in Autopilot mode. According to Tesla Motors, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.” The trailer was high enough off the ground that the car continued under the trailer, shearing off its roof. The car drove off the road and struck two fences and a power pole before coming to a stop.
The accident killing Joshua Brown occurred at an at-grade intersection of a divided highway. On a per-mile basis, divided highways are more dangerous than freeways, which have no at-grade intersections; in fact, the Interstate System of freeways is the safest system of roads in the country.
According to the National Transportation Safety Board, Joshua Brown’s Tesla Model S was traveling 74 miles per hour with Autopilot engaged at the time of the crash with the tractor trailer, 9 miles per hour above the posted speed limit of 65 miles per hour. According to the website Quartz, Autopilot could remain engaged at speeds up to 89 miles per hour.
Question 2: Should Tesla Motors have added restrictions to the beta version of Autopilot so that it could only be activated while driving on freeways?
Question 3: Should Autopilot allow the driver to set a cruising speed above the speed limit, and if so, by how much?
The Hand-off Problem
In 2015, before the Tesla accident, Ford Motor Company announced its plans to introduce a self-driving car by 2021. It also said it was skipping level 3 automation because of an inherent difficulty: how can the computer ensure that the driver is paying enough attention to take over control in an emergency? Ford said its tests indicated that drivers took an average of 3 to 7 seconds, and sometimes as many as 10 seconds, to take control of the vehicle. This is called the hand-off problem.
Handing over control is even more difficult if the driver is distracted. The Model S sedan Joshua Brown was driving had no mechanism to ensure the driver kept their attention on the road while Autopilot was engaged. People have observed Tesla sedans traveling while the “driver” sleeps. The Florida Highway Patrol found a portable DVD player in Joshua Brown’s Tesla Model S. Some witnesses said they heard a Harry Potter movie playing when they approached the car after the accident, although other witnesses at the scene said no movie was playing.
Ford has publicly announced that it will not sell an automobile with level 3 automation. It does plan to start selling an automobile with level 5 automation in 2021, a fully self-driving car with no steering wheel, gas pedal, or brake pedal. Control will never be handed off from the computer to a driver.
Question 4: Should Tesla Motors have released Autopilot to the public when the hand-off problem had not been solved?