What’s the Rush? Are we moving too fast on autonomous vehicles?
Posted: April 10, 2018 by Rolf Lockwood
Perspective is what prevents us from making lousy decisions and watching our stress levels rise as we try to cope with the fallout from those mistakes and misperceptions. Never before, I would argue, has perspective been as necessary as it is now. In life at large and certainly in this industry of ours.
Fear of change is largely built on a failing of perspective, and we’re obviously in the midst of massive changes. We can’t afford to be afraid of them. We also can’t afford to be cavalier.
Which brings me to the testing and occasional use of autonomous and semi-autonomous vehicles on public roads. So far it’s been mostly cars, but heavy trucks have been out there pretty often, too. And in a couple of recent cases, mayhem ensued.
First we had the Arizona fatality involving an Uber taxi in autonomous mode. The facts are a bit sketchy, but in a sense they don’t matter. It’s about the optics.
The vehicle was an SUV traveling at 65 km/h, apparently in Level 4 autonomous mode, and it simply failed to “see” a woman who suddenly stepped onto the roadway at night. A driver was present, though not actively driving. Confusing the issue somewhat, the SUV’s native collision-avoidance system had apparently been turned off in favor of Uber’s own technology.
I’d venture a guess that autonomy actually has little to do with this; that nothing could have prevented the woman’s death. There simply wasn’t time for any reaction, human or otherwise.
More recently, a California man in a Tesla Model X running in Autopilot mode died when the car struck one of those awful concrete lane dividers. The man’s hands had apparently been off the wheel for at least six seconds despite warnings from the car. Did its systems fail? Or is this essentially a new variation on the theme of driver error? Nothing is clear.
Regardless, whatever trust had been built up around the idea of vehicular automation has now been severely damaged by these incidents. That was bound to happen at some point, but Uber was probably right to suspend its autonomous testing after the Arizona tragedy, even though I don’t think its autonomous technology failed. Optics again. The public seems to have little confidence in the autonomous idea in cars, and a lot less when it comes to trucks. It will take time to restore the average person’s willingness to entertain the concept of vehicles driving themselves.
There’s no surprise there, and this really isn’t a setback for proponents of automation, because it was never seen as any kind of slam dunk. The technology is well advanced, though clearly imperfect, and the social and legal aspects were always going to be the bigger challenges by a very wide margin. In a sense, then, nothing has changed.
Not surprisingly, there are calls for more rigorous testing of autonomous technology before such vehicles are let loose on public roads. Even less surprisingly, one of those calls comes from the American Center for Mobility in Willow Run, Mich. It’s a non-profit testing and product development facility designed, in its own words, “to enable safe validation of connected and automated vehicle technology, and accelerate the development of voluntary standards.”
California, on the other hand, is steadily making on-the-road testing easier.
So who’s right? Should we be more cautious than we’ve been so far? I tend to think so, though not because of the recent fatalities. I simply think we’re moving too fast.
I’m certainly not afraid of change, and not of autonomous vehicles in particular, but I really do think we’re being cavalier. It forces me to ask: what’s the rush?