Waymo: Human factor key to unlocking AV safety, public trust

“In order to really convince the public, what we need is to be simple”

(Photo credit: Waymo)
  • During the kickoff session of the Automated Vehicles Symposium, Waymo safety chiefs highlight the importance of the human factor in building and deploying self-driving vehicles.
  • “The public has a hard time trusting what they don’t understand,” said Matthew Schwall, head of field safety.

Alphabet-backed self-driving vehicle company Waymo is seeking to pull back the curtain on some of the company’s safety practices, highlighting the importance of the human factor when building and deploying self-driving vehicles on public roads.

“Even though riders don’t see any people, we have people behind the scenes,” said Qi Hommes, Waymo’s chief of system safety.

Hommes was one of several Waymo directors who participated in a keynote panel discussion that kicked off this week’s virtual Automated Vehicles Symposium, one of the most prominent conferences in the industry.

Widely viewed as one of the AV industry’s leading contenders, Waymo operates a robotaxi service in Arizona and recently announced go-to-market plans for Waymo Via, its self-driving delivery division that equips Class 8 trucks with autonomous technology.


Skepticism takes root

Despite its deep pockets — with investment hitting $3 billion so far in 2020 — and apparent front-runner status, the Silicon Valley company faces a challenging environment as rhetoric around autonomous vehicles shifts from hype to concerns about safety and increasing skepticism about when fully driverless vehicles will actually be deployed.

These and other tensions guided the panel discussion, as the Waymo team talked about the need to balance public trust and collaboration with intellectual property concerns, and the continuing effort to apply novel solutions to what Hommes described as one of the “most complex” technology systems human beings have ever built.

Convincing the public


Matthew Schwall, Waymo’s head of field safety, highlighted his experience working for an automotive testing firm a decade ago, when Toyota was forced to recall millions of vehicles due to concerns about runaway acceleration. Although a possible cause of the problem, a throttle-by-wire system, was subsequently cleared by NHTSA, public trust in the original equipment manufacturer was lost, Schwall said.

“Toyota serves as a reminder that the public has a hard time trusting what they don’t understand,” he said. For that reason, outreach and transparency around autonomous vehicles are critical.

Asked by moderator Chris Gerdes, director of Stanford’s Center for Automotive Research, how the company builds trust while protecting intellectual property, Schwall responded: “Fundamentally, in order to really convince the public and be compelling, what we need is to be simple. And the simpler it is, the less IP is an issue.”

For example, to instill confidence among first responders, Waymo has implemented a toll-free number dedicated to that group and created a video explaining how the technology works. In California and Arizona, where the company runs tests on public roads, Waymo brings the vehicles directly to police and fire stations to train first responders on the technology.

While no intellectual property is revealed during these trainings, Waymo does open the trunk of the vehicles to first responders, a view not available to the broader public. 

About 5% of Waymo robotaxi rides are fully driverless, while the remainder operate with a safety driver, according to Schwall. “There is no substitute for taking the driver out to learn how the public is going to experience this service.” Most riders are excited, some nervous, and “after a few miles, they relax and go back to what they typically do: looking at their phone.” 

For those who are uneasy about the idea of two tons of metal driving itself down the highway, Schwall poses this question: “What would it take for you to have confidence in a young driver? For me what would instill confidence is if they had driven a lot of miles and had a safe driving record.” That’s what Waymo is trying to accomplish.

AV testing: a jumble of acronyms


Simplicity and transparency may be industry buzzwords, but actual behind-the-scenes practices underscore how much complexity and uncertainty still govern AV technology development.

Waymo’s risk management approach aims to account for the human factor in all aspects, said Tracy Murrell, Waymo’s interim head of safety. That means not just operators, but engineers who are building the technology. “If they are fatigued while they are coding,” she said, “errors could come into play.”

To build redundancy into the system, Waymo uses a hazard analysis framework called System Theoretic Process Analysis (STPA), a comprehensive approach that integrates operators, processes, systems and communications, according to Hommes.

For example, early in the STPA analysis, Waymo discovered that its dispatchers might not receive information about crashes in a timely fashion. The dispatchers offer real-time rider support and are responsible for monitoring the location of vehicles at all times. 

That potential lag led Waymo to create another notification layer, so that an alert goes to all dispatchers, not just the one in charge of a given vehicle. That solution is then integrated into software, hardware and operator training.
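Waymo did not describe how that layer is built, but the broadcast pattern it implies can be shown in a brief sketch. The Python below is purely illustrative: the DispatchNotifier class, the IncidentAlert fields and the dispatcher IDs are assumptions made for the example, not Waymo’s actual software.

```python
# Hypothetical sketch of a broadcast-style alert layer, loosely modeled on the
# dispatcher notification fix described above. All names are illustrative.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple


@dataclass
class IncidentAlert:
    """Minimal, made-up representation of a crash or incident report."""
    vehicle_id: str
    location: Tuple[float, float]  # (latitude, longitude)
    description: str


class DispatchNotifier:
    """Every registered dispatcher receives every alert, not only the
    dispatcher assigned to the affected vehicle."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[IncidentAlert], None]] = {}

    def register(self, dispatcher_id: str, handler: Callable[[IncidentAlert], None]) -> None:
        # Each dispatcher workstation registers a callback for incoming alerts.
        self._handlers[dispatcher_id] = handler

    def broadcast(self, alert: IncidentAlert) -> None:
        # Fan the alert out to all dispatchers so that a single person
        # missing the notification is not a point of failure.
        for handler in self._handlers.values():
            handler(alert)


if __name__ == "__main__":
    notifier = DispatchNotifier()
    notifier.register("dispatcher_a", lambda a: print(f"A sees {a.vehicle_id}: {a.description}"))
    notifier.register("dispatcher_b", lambda a: print(f"B sees {a.vehicle_id}: {a.description}"))
    notifier.broadcast(IncidentAlert("AV-042", (33.45, -112.07), "collision reported"))
```

The point of the pattern, as the article describes it, is redundancy: fanning the alert out to every dispatcher removes the dependence on one person seeing it in time.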

Hommes said she is vice chair of an IEEE standards committee that is developing an AV standard focused on “assumptions for inputs going into behavior models.” Throughout this process, “where it makes sense to share and collaborate we are actively doing that.”

First, understand the problem

AV technology poses vastly different problems than those that have been solved before in the safety arena, Hommes said.

Quoting Einstein’s famous maxim — “If I had an hour to solve a problem, I’d spend 55 minutes thinking about the problem and five minutes thinking about solutions” — she issued a challenge to colleagues: 

“Let’s not get into a situation of having a hammer and everything looks like nails. Let’s look at the problem and think about what is different this time: What is the difference between safety and reliability? What are the new roles humans play in Level Four autonomy?”

Only then, Hommes said, should AV companies select the right tools and methodologies to craft solutions.

Related stories:

Report: Wide use of self-driving vehicles ‘at least’ a decade away


Linda Baker, Senior Environment and Technology Reporter

Linda Baker is a FreightWaves senior reporter based in Portland, Oregon. Her beat includes autonomous vehicles, the startup scene, clean trucking, and emissions regulations. Please send tips and story ideas to lbaker@freightwaves.com.