Today's Trucking

Is Autonomy Necessary? The answer might be ‘yes’.

Posted: July 13, 2018 by Rolf Lockwood

How on earth can distracted human driving be the cause of a crash involving a vehicle under autonomous control? Sounds like a good question, doesn’t it? But that’s exactly what happened last March in the accident involving an Uber “taxi” in Tempe, Ariz.

In my May column (“What’s the rush?”) I wrote about that fatality, and while I said the facts were “a bit sketchy,” I suggested that autonomy likely had little to do with it, that nothing could have prevented the woman’s death. There simply wasn’t time for any reaction, human or otherwise, I guessed.

Well, turns out I shouldn’t guess. I was just plain wrong. Actually, half wrong.

There actually was time for the driver to react, according to a 300-page report recently released by the Tempe Police Department. And in fact the report blames the crash on distracted driving. Sound familiar?

To remind you of the circumstances, the Uber vehicle — a Volvo XC90 SUV — was doing 70 km/h on a multi-lane roadway at night, apparently in Level 4 autonomous mode, and simply failed to “see” Elaine Herzberg crossing the road while walking her bicycle. She was jaywalking. A so-called ‘backup’ human driver — Rafaela Vasquez — was present, though not actively driving. Worse than that, the police report says the driver was watching The Voice on a cell phone, and in the 20 minutes or so before the crash, her eyes were off the road some 32% of the time.

Vasquez saw the woman crossing the road only half a second before impact. That’s clear in an in-car video the police released on Twitter, but they say she could have seen the victim 143 feet away and stopped the Volvo some 43 feet before impact. If she’d been paying attention. Simply, she was willfully distracted.

Confusing the issue, the vehicle’s native collision-avoidance system had been partly disabled, according to a report by the National Transportation Safety Board (NTSB). It “saw” Herzberg with six seconds to spare but did not automatically apply the brakes as it would ordinarily do. Nor did it issue a warning to the driver. The automatic braking function was turned off, the NTSB report said, “to reduce the potential for erratic vehicle behavior.” It depended on human intervention. And therein lies Uber’s big mistake, it would seem, alongside an apparent failure to hire smart drivers.

So I was pretty much correct in writing last May that autonomy itself wasn’t to blame here, rather its management. What I didn’t see was the egregious human error.

What does all that really tell us? That humans can’t be trusted, but we knew that. After all, that’s the basic justification for building autonomous vehicles at any level in the first place. We now know that distracted driving can be an issue even at high levels of autonomy like Level 4, which can run the vehicle on its own but requires human supervision. It assumes the human is diligent, and that can obviously be a stretch. But does semi-autonomy get us anywhere if that’s the case? Frankly, I think it may actually justify Level 5 autonomy — completely driverless vehicles.

That seems to be the view of Michael Ruf, head of Continental’s commercial vehicle and aftermarket business units in Germany. By no means just a tiremaker, the company actually has 7,000 employees working exclusively on emerging technologies like vehicle telematics, big data, and active safety systems.

At a recent press event in Frankfurt, Ruf predicted that Level 5 autonomous commercial vehicles will be in limited operation by 2028, following an initial introduction to autonomous technology via platooning operations, which he expects to see in routine operation within five years or so.

That’s an aggressive timeframe. Do you agree?
