What a Navy Pilot Says About Driverless Cars, 'Autopilot' Mode and 'Mode Confusion'
It's been a few months since driverless cars, or "autonomous vehicles" (AVs), made headlines, but some developments since then provide context and give the public some terms to use as we share roads with these heavy, fast-moving machines in the future.
One development is a broadcast interview with the director of Duke University's robotics lab, who advocates that driverless cars should have to pass a vision test. Up to 50 separate AV manufacturing companies are creating driverless cars. An autonomous vehicle "sees" the terrain with a combination of 1) radar, 2) lidar (Light Detection and Ranging), and 3) ultrasound. A "complex data fusion" technology is required to tie all three components together, but that fusion technology isn't subject to vision tests in California the way human drivers are.
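To make the "complex data fusion" idea concrete, here is a minimal sketch, illustrative only: the sensor noise figures are assumptions, and no manufacturer's actual pipeline looks this simple. It fuses distance estimates from radar, lidar and ultrasound by trusting each sensor in proportion to the inverse of its assumed noise:

```python
# Toy illustration of sensor fusion: combine distance estimates from
# radar, lidar, and ultrasound into one estimate by weighting each sensor
# by how much we trust it (inverse of its noise variance).
# The variance numbers below are made up for illustration only.

SENSOR_VARIANCE_M2 = {
    "radar": 0.50,       # assumed: decent range accuracy, poor lateral detail
    "lidar": 0.05,       # assumed: precise, but degraded by rain or fog
    "ultrasound": 1.00,  # assumed: short range, noisy
}

def fuse_distance(readings_m: dict[str, float]) -> float:
    """Inverse-variance weighted average of per-sensor distance readings."""
    weights = {name: 1.0 / SENSOR_VARIANCE_M2[name] for name in readings_m}
    total_weight = sum(weights.values())
    return sum(weights[name] * dist for name, dist in readings_m.items()) / total_weight

if __name__ == "__main__":
    # Three sensors disagree slightly about the distance to an obstacle.
    readings = {"radar": 24.2, "lidar": 23.8, "ultrasound": 25.0}
    print(f"fused distance: {fuse_distance(readings):.2f} m")
```

Real AV perception fuses entire point clouds and object tracks rather than single distance numbers, which is part of why the fusion step is hard to certify, and why the vision-test idea below has appeal.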
In May, tech reporter Molly Wood interviewed Missy Cummings, a former Navy pilot and the current director of Duke University's HAL (the Humans and Autonomy Lab, the former MIT lab that moved to Duke). Wood asked Cummings about autonomous and semi-autonomous vehicles, and the interview was broadcast on May 2 on the Marketplace Tech radio program. Below is a transcript of that interview.
***(Voiceover) M. Wood: If we have the chance to stop paying attention, we will. In accidents, two of them fatal, involving Tesla's Autopilot system, it appears the drivers didn't necessarily take over when the semi-autonomous system wasn't up to the task.

Wood asked Cummings whether drivers of non-autonomous vehicles going about their daily lives should be part of the "test environment" -- a test environment on public streets with no consent from those drivers -- in order to advance the machine-learned behaviors of semi-autonomous vehicles, as they were set to be in California back in April 2018.
Missy Cummings directs the Humans and Autonomy Lab (HAL) at Duke University and, as it happens, was one of the U.S. Navy's first female fighter pilots.
I asked her what stands between now and fully self-driving cars.***
M. Cummings: Well, I think that one of the problems with these levels is that they seem linear, in that we should go [level] 1, 2, 3, 4, 5. But the reality is that there are really two different paths. There's the [level] 1, 2, 3 path and then there's the [level] 1, 2, 4, 5 path.
And the reason this is an issue is that level 3 is where the automation is partially capable but not fully capable.
And we have to have that in the cases where the cars can't perform under all conditions. Then the car hands control back to the human. And this is the deadliest phase. In fact, I'm pretty much against level 3; I don't think it should exist at all. Because one thing I know as a former fighter pilot: having a human step inside the control loop at the last possible minute is a guaranteed disaster.
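For readers unfamiliar with the levels Cummings refers to, the short sketch below is an editorial illustration, not part of the interview; the level descriptions are paraphrased from the SAE J3016 taxonomy. It lists the levels and the two development paths she describes, the second of which skips level 3 entirely.

```python
# SAE J3016 driving-automation levels (paraphrased descriptions) and the
# two development paths Cummings describes. Editorial illustration only.
SAE_LEVELS = {
    0: "no automation: human does everything",
    1: "driver assistance: steering or speed, not both",
    2: "partial automation: steering and speed, human must monitor",
    3: "conditional automation: car drives, human must take over on request",
    4: "high automation: no human fallback within a defined domain",
    5: "full automation: no human fallback anywhere",
}

PATH_WITH_HANDOVER = [1, 2, 3]        # relies on last-minute human takeover
PATH_SKIPPING_LEVEL_3 = [1, 2, 4, 5]  # the path Cummings argues for

for level in PATH_SKIPPING_LEVEL_3:
    print(f"Level {level}: {SAE_LEVELS[level]}")
```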
M. Wood: And yet we see car makers going there. We arguably see Tesla sort of pushing that. Does it make drivers part of, like, a living R&D lab on city streets?

Next Wood asks Cummings how the Duke HAL lab recommends manufacturers notify customers -- the drivers of semi-autonomous vehicles -- when these cars are "ready" to use.
M. Cummings: Well, I think there are two issues here. Number one: should we have cars that have to ask for human intervention at time-critical periods? That answer is no.
And that's a separate issue: should we allow car makers to use the American public as guinea pigs to test out these new technologies? I think that answer is also no.
M. Wood: How do automakers need to communicate to car owners when they're safe to use?
M. Cummings: Well, I think this is a real problem. Because in the aviation world, when we put fancy new automated technologies into aircraft -- which are actually not as complex as the ones that exist in cars right now, which I think is a big surprise to most people -- we make commercial pilots go through years of training. They have annual check rides. They have to show that they know the rules. They have to take a flight exam with a flight examiner to show that they can operate the vehicle under conditions of uncertainty.
We're not doing that for human drivers, right?
You know, this is a real issue. Because if you have these complex modes of operation -- which we know exist in these driverless cars, or even driver-assist cars -- and we know people aren't reading the manuals, then arguably are we doing the ethical thing by allowing them to be on the roads? And I think the answer is no.
M. Wood: So what needs to happen? Setting aside what we think might or might not happen in the current environment, what *does* need to happen from NHTSA or any other officials?
M. Cummings: So the reality is, we still need to invest in this technology. I'm not a Luddite; I run a robotics lab. I want the technology to continue to progress. And so one of the things that I personally have been advocating for is that we need to establish vision tests for AVs. One of the things that we know about driverless cars is that their perception systems -- what makes them see -- are deeply flawed. So if we know that the perception systems are deeply flawed, then we need to have a set of tests that the cars must be able to pass before they're allowed on public roads.
***(Voiceover) M. Wood: M. Cummings directs the Humans and Autonomy Lab (HAL) at Duke University. Earlier this week a Tesla driver in England lost his license for 18 months for putting his car in autopilot mode, and then moving over to the passenger seat. I'm Molly Wood and that is Marketplace Tech.***
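The "vision test" Cummings advocates could, in spirit, look like a scored perception benchmark. The sketch below is a hypothetical harness, not an actual regulatory test; the scenario names, the 95% threshold, and the detects_obstacle callable are all assumptions made for illustration. The idea is simply that the perception system must classify a fixed set of scenarios correctly before the car is allowed on public roads.

```python
# Hypothetical "vision test" harness for an AV perception system.
# Scenario names and the 95% pass threshold are illustrative assumptions.

from typing import Callable

Scenario = dict  # e.g. {"name": "...", "obstacle_present": True}

TEST_SCENARIOS: list[Scenario] = [
    {"name": "stopped fire truck, 60 mph approach", "obstacle_present": True},
    {"name": "pedestrian at dusk, light rain",      "obstacle_present": True},
    {"name": "clear highway, no obstacle",          "obstacle_present": False},
    {"name": "cyclist partially occluded by van",   "obstacle_present": True},
]

def run_vision_test(detects_obstacle: Callable[[Scenario], bool],
                    pass_rate: float = 0.95) -> bool:
    """Return True if the perception system classifies enough scenarios correctly."""
    correct = sum(
        detects_obstacle(s) == s["obstacle_present"] for s in TEST_SCENARIOS
    )
    return correct / len(TEST_SCENARIOS) >= pass_rate
```

A licensed human driver faces an analogous gate, an eye exam and a road test, before driving on public roads; the point of the sketch is only that a gate exists and is checked before road use, not that four scenarios would be anywhere near sufficient.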
In 2017 the Duke HAL lab published a newsletter forecasting the next year's developments in autonomous vehicles, outlining these terms as well as "mode confusion," a concept this site introduced, though not by name, back in April. The 2017 Duke University HAL Lab newsletter states:
Tesla maintains that both drivers were at fault for not paying attention to Autopilot, technically a “driver assist” technology in a car that is not intended to be fully autonomous. Apparently the drivers did not understand this nuance—perhaps they were misled by the name Autopilot. Similar examples of “mode confusion” are well known in aviation and will only increase as automation becomes more prevalent in cars. These two fatalities highlight several issues raised before the Senate Commerce Committee in March 2016—namely that the entry of driverless cars into the market may reveal many unknowns, but in the meanwhile, manufacturers are failing to address many problems that are known (3). For example, Tesla knew about the inability of Autopilot to detect static objects on highways, and the owner’s manual warned drivers that the car may not brake for stationary vehicles, especially when the car is driving faster than 50 mph (4). A significant flaw in the car’s perception system — that is, how the car “sees” the world — and the lack of transparency to the drivers led to these two fatalities, and these are problems not easily solved.

There is a federal regulation that mandates all humans involved in an experiment must explicitly give their consent, a consent that was not being obtained in San Francisco back in April, when driverless cars were set to start driving on public roads. But as this site said back in April, new technology opens new gray areas, and neither authorities nor the public had illustrated in sharp relief that pedestrians, bicyclists and drivers sharing roads with "test" autonomous vehicles were involved in an experiment. There was so much trust in high tech companies in early April that even their "tests" were deemed more trustworthy than licensed humans. A lot has changed (see Driverless Cars Start Today; Rand Says Unready Cars, Injuries & Fatalities Justified at this site).
The explicit consent rule is in the Code of Federal Regulations, Title 45, Part 46, "Protection of Human Subjects," and the regulation's exact wording can be read at HHS here.
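Returning to the perception limitation the newsletter quote describes: the owner's-manual warning that the car may not brake for stationary vehicles above roughly 50 mph matches a common design trade-off in radar-based cruise control, where stationary returns are often filtered out to avoid braking for overpasses and roadside signs. The sketch below is a hypothetical, simplified decision rule showing how such a filter can miss a stopped vehicle; it is not Tesla's actual logic, and the thresholds are invented for illustration.

```python
# Hypothetical, simplified braking decision for a radar-based driver-assist
# system. Filtering out stationary returns at highway speed (a common
# strategy to avoid phantom braking on bridges and signs) is exactly what
# lets a stopped vehicle go unnoticed. Thresholds are illustrative only.

def should_brake(own_speed_mph: float,
                 target_distance_m: float,
                 target_speed_mph: float) -> bool:
    stationary = abs(target_speed_mph) < 2.0      # target barely moving
    if own_speed_mph > 50 and stationary:
        # Stationary returns above ~50 mph are ignored to suppress false
        # alarms from overpasses and roadside clutter -- the documented gap.
        return False
    return target_distance_m < 40                 # naive distance trigger

# A car stopped 35 m ahead while we travel at 65 mph: the filter skips it.
print(should_brake(own_speed_mph=65, target_distance_m=35, target_speed_mph=0))  # False
```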
--------------------
Further Reading:
Autonomous Vehicles Levels 0-5: Understanding the Differences: techrepublic.com
Ford AV expert Jim McBride agrees that level 3 is problematic. "He's focused on getting Ford straight to Level 4, since Level 3, which involves transferring control from car to human, can often pose difficulties. 'We're not going to ask the driver to instantaneously intervene—that's not a fair proposition.'" techrepublic.com
Driverless Cars Start Today; Rand Says Unready Cars, Injuries & Fatalities Justified: www.offlinereport.net
New Technology Opens Gray Areas: The Tesla Autopilot Crash: www.offlinereport.net
This work by AJ Fish is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.