UPDATED New Technology Opens Gray Areas: The Tesla Autopilot Crash

Experienced drivers know "cruise control" has been a setting offered in various cars since at least the 1980s, one that drivers use on long, straight stretches of highway. Cruise control is a primitive form of autopilot, and today, with both the Tesla Model X and self-driving cars on California roads, it's a reasonable time to review and extend common usability terms.

I. Cruise Control
II. Tesla's Autopilot Crash
A. Steering Autopilot
B. Chain of Command - Passive or Active Autopilot Deactivation
C. Communicating Chain of Command
D. Communicating Chain of Command - Speed & Steering "My Landcraft" "Your Landcraft"
E. Ghost Cars


I. Cruise Control
The usability is incredibly simple. To set it, you keep your foot on the gas until the speedometer shows the speed you want the car to sustain, then you press the car's "cruise control" activation button on the dashboard or steering mount.

With the setting activated, you the driver can take your foot off the gas without causing the car to lose speed.

If traffic ahead piles up and you need to slow the car, tapping the brake or the gas pedal with your foot deactivates cruise control, and the car is no longer driving on autopilot of any kind.

We could call cruise control "speed autopilot" or *primitive speed autopilot*.

Apportioning responsibility between a) human pilot and b) *primitive speed autopilot* is clear, unbelievably clear by today's standards. If the cruise control is set at 65 miles per hour but the car lurches up to 70, the technology (and potentially its manufacturer) is at fault. If an obstruction falls off a truck 20 feet ahead and the driver fails to pump the brakes and there is impact, the car sustained speed as it was supposed to and the driver is at fault.
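
The chain of command here is simple enough to model in a few lines. A toy sketch, assuming invented class and method names (this is not any manufacturer's actual API):

```python
# Toy model of primitive speed autopilot (cruise control).
# Illustrative only; all names here are invented for this sketch.

class CruiseControl:
    def __init__(self):
        self.engaged = False
        self.set_speed = None

    def activate(self, current_speed):
        """Driver presses the activation button at the speed to sustain."""
        self.set_speed = current_speed
        self.engaged = True

    def pedal_tapped(self):
        """Tapping the brake or gas passively returns full command to the driver."""
        self.engaged = False

    def target_speed(self, current_speed):
        """The car sustains the set speed; otherwise the driver commands speed."""
        return self.set_speed if self.engaged else current_speed

# Usage: set at 65 mph, then a tap on the brake hands speed back to the human pilot.
cc = CruiseControl()
cc.activate(current_speed=65)
assert cc.target_speed(current_speed=65) == 65
cc.pedal_tapped()
assert not cc.engaged
```

The point of the sketch is the asymmetry: activation is an explicit button press, while deactivation is passive - any pedal input hands command back to the human.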


II. Tesla's Autopilot Crash
An advanced Tesla, the Model X, with extended autopilot features and owned by a 38-year-old Apple engineer, crashed on Friday, March 23. The accident on Highway 101 in Mountain View killed Walter Huang, a father of two.

A. Steering Autopilot
Based on reports from friends of mine who drove these cars before Friday, this model had impressive course-correcting "steering autopilot" technology.

An abc7 news reporter investigating the Mountain View Tesla crash relayed communication from the driver's brother that he'd complained to family his new car's *steering autopilot* veered to the left seven to ten times on a specific highway segment. Huang had taken the car to the dealer to be fixed, but they did not find a problem, his brother said.

An experienced driver sees things a certain way - and asks these questions about the usability of these Teslas' "steering autopilot," "speed autopilot," and "seeing autopilot" features.

B. Chain of Command - Passive or Active Autopilot Deactivation
How does a driver decide when he's supposed to take command of the steering? The driver's hands are supposed to remain on the wheel for the autopilot to work correctly, the news report said. Is *steering autopilot* passively deactivated when the human pilot steers the car? Or does a human pilot intentionally deactivate steering autopilot by the press of a button? How does he put steering back on autopilot?
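
The question has at least two concrete answers, and they are different designs. A minimal sketch of both, with invented class names and no claim about Tesla's actual logic:

```python
# Two hypothetical deactivation policies for *steering autopilot*.
# Neither is claimed to be Tesla's implementation.

class PassiveDeactivation:
    """Autopilot yields the moment the human steers,
    the way cruise control yields to a pedal tap."""
    def __init__(self):
        self.engaged = True

    def driver_steers(self):
        self.engaged = False  # human input alone transfers command


class ActiveDeactivation:
    """Autopilot holds command until the human presses a button."""
    def __init__(self):
        self.engaged = True

    def driver_steers(self):
        pass  # steering input alone does NOT transfer command

    def press_disengage_button(self):
        self.engaged = False  # only a deliberate act transfers command
```

Which policy the car follows changes everything about who is in command at the moment the wheel moves.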

C. Communicating Chain of Command
We all know how scary it is when a teenager grabs control of the steering wheel of a car we're driving in a town or city while we're still controlling the gas pedal and brakes with our feet. "What are you doing? Stop that or get out of the car."

On the other hand, it's a thrill as a kid when an elder lets you sit in Aunt Jackie's lap and steer the truck on a wide open road while she works the gas and brake pedals.

It's a mutual understanding when a husband asks his wife to take the steering wheel from the passenger seat on a straight, sparse stretch of highway, briefly freeing his hands to retrieve an item from his pockets or the car floor.

With both Aunt Jackie and the husband and wife, chain of command was clearly communicated, and they split command of steering and speed between two separate pilots only on predictable terrain.

D. Communicating Chain of Command - Speed & Steering "My Landcraft" "Your Landcraft"
How does the driver know when to take command of the steering and speed functions of Tesla Model X at the same time? How do the car and human pilot communicate chain of command - tell the Tesla "my aircraft" and wait for the Tesla to respond "your aircraft" like Tom Hanks in "Sully" before assuming command of the car?
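
Aviation's convention is an explicit handshake: command transfers only when requested and acknowledged. A toy version of that protocol, with invented names (no real vehicle exposes such an interface):

```python
# "My aircraft" / "Your aircraft": command transfers only on
# request plus acknowledgment. Hypothetical sketch.

class ChainOfCommand:
    def __init__(self):
        self.pilot_in_command = "autopilot"
        self.pending_request = None

    def request_command(self, who):
        """A pilot says 'my aircraft' (human or autopilot)."""
        self.pending_request = who

    def acknowledge(self):
        """Current pilot answers 'your aircraft'; only now does command change hands."""
        if self.pending_request:
            self.pilot_in_command = self.pending_request
            self.pending_request = None

handoff = ChainOfCommand()
handoff.request_command("human")   # "My aircraft."
handoff.acknowledge()              # "Your aircraft."
assert handoff.pilot_in_command == "human"
```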

E. Ghost Cars
It doesn't make sense for a human pilot to take over *seeing autopilot* duties, since steering, accelerating, decelerating and braking are active motions, whereas seeing is passive detection. However, you could imagine a car auto-detecting mud splashing up and obstructing its sensors, and alerting the driver so the human pilot can assume control of the car.
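
A minimal sketch of that idea, assuming an invented self-check function and arbitrary confidence numbers: the car monitors its own perception and asks the human to take command when it degrades.

```python
# Hypothetical self-check for *seeing autopilot*: detect a degraded
# sensor (mud on the lens) and tell the human pilot to take command.

def check_sensors(confidence_by_sensor, threshold=0.5):
    """Return the sensors whose confidence has dropped below the threshold."""
    return [name for name, conf in confidence_by_sensor.items() if conf < threshold]

degraded = check_sensors({"front_camera": 0.2, "radar": 0.9})
if degraded:
    print(f"Sensors degraded: {degraded} -- human pilot, assume command.")
```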

This term "ghost cars" is mentioned in the second abc7 news segment when a Tesla Model X owner drives the reporter to the crash site in the same vehicle model. The driver says the car will sometimes detect "ghost cars" that are not really there and suddenly pump the breaks. The driver also told reporter Dan Noyes "if the car sees a white line, it might think that's a lane" instead of the barrier that Huang crashed into and died.

Said Noyes, "notice the break between asphalt and concrete, and those two white lines. Could they have guided Walter's autopilot right into the barrier?"


UPDATE 4/3/2018: To be clear, Tesla never defined these usability terms. The terms defined above a) describe how an experienced human driver sees the vehicle and road pre-Model X and b) will be used by some drivers to navigate vehicle and road post-Model X.

Also, Walter Huang's brother Will never claimed Walter used the term "steering autopilot," which is why this writer enclosed the term in asterisks, not quotes, in the hyperlinked portion above ("relayed communication from the driver's brother that he'd complained to family his new car's *steering autopilot* veered to the left"). Will Huang told abc7news in a text that Walter had complained to him that "seven to ten times" the car veered left at a specific highway segment.

The surviving brother Will Huang texted abc7news this: "7/10 times the car would swivel toward that same exact barrier/During auto pilot/Walter took it into dealership addressing the issue/But they couldn't duplicate it there."

Was it an action-motion issue (*steering autopilot*, *speed autopilot*), a perceiving issue (*motion sensors*), a navigation issue (*sensor navigation*, *gps navigation*), a controller issue (did *sensor navigation* properly override *gps navigation*?), or a chain of command issue ("My Aircraft" "Your Aircraft," like in "Sully")?

A human life is precious. Be persistent with the dealer. Be persistent with the customer. Articulate terms, risk being "uncool" to get on the same page. Call the customer at home. Be so persistent you risk a harassment lawsuit from the dealer and customer, respectively.

CORRECTIONS 4/3/2018: The crash took place March 23. "Cruise control" features were available as early as the 1980s.

-------------
Further Reading:

Reporter rides a Tesla Model X to the crash site where Apple engineer and father of two Walter Huang crashed and died. "If the car sees a white line, it might think that's a lane" instead of the barrier Huang crashed into, Noyes' source said.   abc7news.com

4/3/2018: The National Transportation Safety Board will take months to complete the investigation, then a few weeks to publish its report. The NTSB did something "highly unusual" by issuing a statement Sunday that it is "unhappy" with Tesla for disclosing, in a company blog post, its side of the story and the information it could provide to the public.   cnbc.com

4/3/2018: Tesla's first blog post published on March 27 tesla.com/blog

4/3/2018: Tesla's second blog post published on March 30 tesla.com/blog, which sparked the rebuke from the NTSB on Sunday, April 1.







This work by AJ Fish is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
