NIAGARA FALLS, ON – The questions surrounding autonomous vehicles are not limited to how such technology would actually work, and Rick Geller has identified a long list of related issues that have yet to be addressed.
“We really need to bring some focus to what it is that we’re trying to accomplish with this technology,” said Geller, vice president – transportation industry leader with Marsh Canada, during a presentation to the Private Motor Truck Council of Canada’s annual meeting.
It’s an important distinction. While some people see autonomous technology as a tool to compensate for a lack of skill or training, others view it as a potential “guard rail” against other threats, he said.
“You have to look at the technology itself, but you also have to look at that broader ecosystem that supports it,” Geller added, referring to required gains in everything from communication to infrastructure. Each could affect the rate of any rollouts. Changes in infrastructure, he observed, can be measured in decades. In the meantime, existing vehicles will be sharing space on the same roads.
There are ethical issues to consider as well. “Who are we designing the system to protect when we’re looking at autonomous vehicles? Because the decision is made by computer,” he said. Picture an autonomous truck traveling in the center lane of a highway when it detects a threat ahead. To the left is a motorcycle, and to the right is an SUV. If the computer is programmed to protect the truck driver, it might choose to steer toward the motorcycle. If it is programmed to protect other motorists, it may choose to sacrifice itself against the threat ahead of the truck. “These are very real ethical decisions that need to be made,” he said.
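The point is that the choice has to be written down somewhere before the crash. A minimal sketch of how a protection priority could steer the outcome, with entirely hypothetical maneuver names and a deliberately crude selection rule (no resemblance to a real vehicle planner is implied):

```python
# Hypothetical sketch: each candidate evasive maneuver most endangers one
# party, and the programmed priority decides which maneuver gets picked.
MANEUVERS = {
    "brake_straight": "truck_driver",    # absorb the impact ahead
    "swerve_left": "motorcyclist",
    "swerve_right": "suv_occupants",
}

def choose_maneuver(protect: str) -> str:
    """Return the first maneuver that does not endanger the protected party."""
    for maneuver, endangered in MANEUVERS.items():
        if endangered != protect:
            return maneuver
    return "brake_straight"

# Protecting the truck driver steers the truck toward the motorcycle;
# protecting others means braking straight into the threat ahead.
print(choose_maneuver("truck_driver"))   # swerve_left
print(choose_maneuver("motorcyclist"))   # brake_straight
```

Either way, the ethical trade-off is fixed by whoever set the priority, not by the driver in the moment.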
Even the approach to the programming will make a difference. A rule-based system is programmed to execute commands based on specific information. Artificial intelligence, however, learns through experience. If engineers steer toward the latter model, there’s the question of whether a given truck is running the latest version of the technology, he said.
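The contrast between the two approaches can be sketched in a few lines. Both functions below are invented for illustration, with made-up thresholds; neither resembles a production system:

```python
# Rule-based: behaviour is fixed by an explicit, auditable command.
def rule_based_brake(distance_m: float) -> bool:
    # Hypothetical fixed rule: brake when an obstacle is within 50 m.
    return distance_m < 50.0

# Learning-based: behaviour comes from examples, so two trucks trained on
# different data, or running different model versions, may act differently.
def fit_brake_threshold(examples: list[tuple[float, bool]]) -> float:
    """Derive a braking distance from (distance_m, driver_braked) examples."""
    braked_at = [d for d, braked in examples if braked]
    # Crude stand-in for learning: adopt the largest distance at which
    # the example drivers actually braked.
    return max(braked_at) if braked_at else 0.0

# Different training data yields a different threshold.
threshold = fit_brake_threshold([(40.0, True), (60.0, True), (80.0, False)])
print(threshold)  # 60.0
```

This is why the version question matters for the learned model: the rule never changes unless someone rewrites it, while the learned threshold shifts with every retraining.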
Drivers will need training, too. “When ABS brakes came out, people believed that vehicles would stop faster. It’s not correct, but that’s what they believed,” Geller said. Any users will need to understand the technology’s limitations.
For that matter, the advanced technology could cause different skills to erode. “If we’re going to ask technology to start driving the trucks and we’re going to ask people to sit back and go make a sandwich or knit, when you don’t use skills, they degrade,” he said. Drivers could even begin to think of themselves as invincible, taking unwanted risks.
There may also be times when drivers are asked to take over controls, which could present its own set of challenges. One research project by the U.S. Department of Transportation tested visual, audible, and haptic alerts, as well as countdowns. Shaking seats did the best job of grabbing attention, Geller said, citing the results. But when the countdowns began, most people tried to finish the tasks they were doing before acting on the warnings.
It isn’t the only reason warnings might be needed. Drivers could be confused about the mode a vehicle is operating in, Geller said. They could be surprised by actions that a truck takes.
Insurance presents a challenge of its own after a crash. Auto coverage might give way to product liability coverage. Cases like that can take years to resolve.
As the underlying technology emerges, Geller said fleets should begin developing corporate strategies that identify how they want to leverage the tools. “What’s the business problem you’re trying to solve?” he asked. Without answering that, there is a risk of overbuying and underutilizing the technologies.
“Train everybody on that technology,” he said, offering a final piece of advice. “Let your company know what that strategy is.”