Someday Your Self-Driving Car Will Pull Over for Police

(Bloomberg) -- It was still dark on a Friday morning in November when a California Highway Patrol officer started following a Tesla Model S on Route 101 between the San Francisco International Airport and Palo Alto. The gray sedan was going 70 miles per hour with a turn signal blinking, cruising past multiple exits. The officer pulled up alongside and saw the driver in a head-slumped posture. Lights and sirens failed to rouse him. The car, the officer guessed, was driving itself under the control of what Tesla calls Autopilot.

Every Tesla is equipped with hardware that the automaker says will eventually enable its vehicles to drive themselves on entire trips, from parking space to parking space, with no input from the driver. For now, the company limits its cars to a system that can guide them from on-ramp to off-ramp on highways. That system is smart enough, it seems, to keep a Tesla driving safely even with a seemingly incapacitated driver, but not yet smart enough to obey police sirens and pull over.

This case appears to be the first time law enforcement has stopped a vehicle on an open road under the control of an automated system. There was no way for police to commandeer the driving software, so they improvised a way to manipulate Tesla’s safety programming. A highway patrol car blocked traffic from behind while the officer following the Tesla pulled in front and began to slow down until both cars came to a stop.  

The incident encapsulates both the high hopes and deep anxieties of the driverless future. The Tesla’s driver, a 45-year-old Los Altos man, failed a field sobriety test, according to the police, and has been charged with driving under the influence; a trial is scheduled for May. The car, which seems to have navigated about 10 miles of nighttime highway driving without the aid of a human, may well have saved a drunk driver from harming himself or others. Neither Tesla nor the police, however, are ready for people to begin relying on the technology in this way. 

Drivers, according to Tesla’s disclaimer, are supposed to remain “alert and active” when using Autopilot and be prepared to take over control if, for instance, the police approach. If the car doesn’t sense hands on the wheel, it’s supposed to slow to a stop and put on its hazard lights. Two days after the incident, Tesla Inc. Chief Executive Officer Elon Musk tweeted that he was “looking into what happened here.” A company spokesperson referred back to Musk’s tweet and declined to share anything the company had learned from the car’s data log.
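
Tesla hasn’t published how that fallback works, but the behavior its disclaimer describes amounts to an escalating timeout on driver input. A minimal sketch of that kind of logic, with invented names and thresholds, might look like this:

```python
from dataclasses import dataclass
from enum import Enum, auto

class FallbackState(Enum):
    NOMINAL = auto()
    WARNING = auto()   # visual and audible nags to the driver
    STOPPING = auto()  # slow to a stop with hazard lights on

# Hypothetical thresholds -- Tesla's actual values are not public.
WARN_AFTER_S = 30.0
STOP_AFTER_S = 60.0

@dataclass
class DriverMonitor:
    hands_off_s: float = 0.0
    state: FallbackState = FallbackState.NOMINAL

    def update(self, dt: float, torque_on_wheel: bool) -> FallbackState:
        """Advance the monitor by dt seconds of sensor input."""
        if torque_on_wheel:
            # Any detected steering torque resets the escalation.
            self.hands_off_s = 0.0
            self.state = FallbackState.NOMINAL
        else:
            self.hands_off_s += dt
            if self.hands_off_s >= STOP_AFTER_S:
                self.state = FallbackState.STOPPING
            elif self.hands_off_s >= WARN_AFTER_S:
                self.state = FallbackState.WARNING
        return self.state
```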

“My guess as to when we would think it is safe for somebody to essentially fall asleep and wake up at their destination? Probably towards the end of next year,” Musk told the podcast “For Your Innovation” in an episode released on Monday.

The cops who stopped the Tesla that night had never used the technique before. It’s not part of their training. It just so happened that they knew enough about the Tesla to improvise a response. “That’s great adaptation,” says Lieutenant Saul Jaeger of the nearby Mountain View Police Department. Such familiarity is to be expected, perhaps, in the heart of Silicon Valley—the car stopped halfway between the headquarters of Facebook and Google—but relying on the quick wits of law enforcement is not a scalable plan. 

Robots can’t take control of the roads until automakers, engineers, lawmakers and police work through a series of thorny problems: How can a cop pull over an autonomous car? What should robot drivers do after a collision? How do you program a vehicle to recognize human authorities?

Five years ago a roboticist at the Massachusetts Institute of Technology named John Leonard began taking dashcam videos on his drives around Boston. He was looking to catalog moments that would be difficult for artificial intelligence to navigate. One night he saw a police officer step into a busy intersection to block oncoming traffic and allow pedestrians to cross against the light. He added that to his list.

Of all the challenges facing self-driving technology, what-if instances like these are among the most daunting and a big part of the reason truly driverless cars are going to arrive “more slowly than many in the industry predicted,” says Leonard. He’s in a position to know: Leonard took leave from MIT in 2016 to join the Toyota Research Institute and help lead the automaker’s AV efforts.

Waymo LLC, the autonomous-driving startup launched by Google’s parent company and now serving passengers in the Phoenix area, has already run up against almost the exact scenario that worried Leonard. In January a sensor-laden Chrysler Pacifica minivan in Waymo’s automated ride-hailing fleet rolled up to a darkened stoplight in Tempe, Arizona. The power had gone out, and a police officer stood in the roadway directing traffic. In dashcam footage alongside a rendering of the computer vision provided by Waymo, the minivan stops at the intersection, waits for the cross traffic and a left-turning car coming the other way, then proceeds when waved through by the officer.

A Waymo spokeswoman, Alexis Georgeson, says the company’s fleet can distinguish between civilians and police standing in the roadway and can follow hand signals. “They will yield and respond based on recognition that it is a police officer,” she says. “Our cars do really well navigating construction zones and responding to uniformed officers.”

Waymo is taking a territorial approach to autonomous vehicles, focusing on developing fleets that could serve as taxis in limited areas and stopping short of full, go-anywhere autonomy, a not-yet-reached threshold known in the industry as Level 5. Working in a limited space allows both for building detailed maps and for easier coordination with government and law enforcement. Rather than trying to launch across different jurisdictions, Waymo picked Chandler, a Phoenix suburb with wide avenues, sunny weather and welcoming state and local governments, for its first living laboratory. Many of its competitors are taking a similar approach, focusing on fleets that would stay within defined boundaries. Ford Motor Co. is testing in Miami and Washington, D.C.; General Motors Co.’s Cruise, Zoox Inc. and Toyota Motor Corp. are among the dozens of companies that have autonomous cars testing on roads in California.   

In the summer of 2017, about a year and a half before the debut of its initial ride-hailing service in Chandler, Waymo invited local police, firefighters and ambulance personnel to a day of testing in which trucks and patrol cars—sirens blaring and lights flashing—approached the driverless minivans from all angles on a closed course. “We’ve had a lot of interaction with their personnel on the research and development of their technology,” says Chandler spokesman Matt Burdick.

Last year, Waymo became the first AV maker to publish a law enforcement interaction protocol. If one of its self-driving cars detects police behind it with lights flashing, the document says, it’s “designed to pull over and stop when it finds a safe place to do so.”
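
The protocol describes behavior, not code. A toy decision function consistent with that description, all names and thresholds invented for illustration, could be as simple as:

```python
from dataclasses import dataclass

@dataclass
class Perception:
    emergency_lights_behind: bool  # flashing lights classified behind the AV
    siren_detected: bool
    safe_shoulder_ahead_m: float   # distance to next safe stopping spot

def plan_pullover(p: Perception) -> str:
    """Return a high-level maneuver matching the protocol's description."""
    if not (p.emergency_lights_behind or p.siren_detected):
        return "continue_route"
    if p.safe_shoulder_ahead_m < 200.0:  # hypothetical threshold
        return "pull_over_and_stop"
    # No safe spot yet: keep driving cautiously until one appears.
    return "slow_and_search_for_safe_stop"
```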

Jeff West, a battalion chief with the Chandler Fire Department, says the Waymo vehicles he’s seen on the road have been quicker to move out of the way than many human-driven cars. “Once it recognizes us, it pulls over,” he says, “versus someone maybe listening to a radio, or they got their air conditioner on.”

For now, however, most Waymo cabs have a safety driver at the wheel to take over in any situation that might stump the car. There haven’t been any run-ins between local police and a human-free driverless car yet, Burdick says. When that day comes, says Matthew Schwall, head of field safety at Waymo, the police can get in touch with the company’s support team by either calling a 24-hour hotline or pushing the minivan’s help button above the second row of seats. At that point, Waymo’s remote staff can’t take direct control of the vehicle, but they can reroute it—if, for instance, the police want it to move to the side of the roadway after a collision.

Michigan state trooper Ken Monroe took Ford engineers on ride-alongs around Flint last summer. The engineers were especially curious about what he wanted drivers to do as he came up behind them with lights flashing, and how those responses differed depending on whether he was pulling over a car or trying to get past. 

“While I was responding to an emergency, they said, ‘OK, you’re approaching this vehicle here. What is the best-case scenario that you can find for that vehicle to do?’ ” They spoke at length, Monroe says, about how an autonomous vehicle could recognize when it was being pulled over. “The biggest cue that we came up with was just the length of time that the police vehicle was behind the AV.” 
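
That cue translates naturally into a heuristic: a cruiser that lingers behind is probably initiating a stop, while one closing fast wants by. A hypothetical sketch, with illustrative thresholds only:

```python
def classify_police_intent(seconds_behind: float,
                           closing_speed_mps: float) -> str:
    """Toy heuristic from the ride-along insight: a patrol car that
    stays behind the AV is likely pulling it over; one closing fast
    is likely trying to get past. Thresholds are invented."""
    if closing_speed_mps > 5.0:
        return "yield_right_emergency_pass"  # let it by
    if seconds_behind > 10.0:
        return "being_pulled_over"           # find a safe stop
    return "monitor"                         # keep watching
```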

In addition to its testing in Miami and Washington, Ford has been working with police in Michigan for nearly two years as part of preparations for the rollout of autonomous ride-hailing and delivery cars scheduled for 2021. Two years ago, a few dozen troopers from the Michigan State Police came to its offices in Dearborn to talk about its plans. “We emphasized that these are not going to be privately owned vehicles,” says Colm Boran, head of autonomous vehicle systems engineering at Ford. “That instantly helped to relieve some of their concerns.”

Teaching autonomous cars to pull to the right is a relatively straightforward task. The point of the lights and sirens, after all, is to be noticed from far away. “If it’s salient to a human, it’s probably salient to a machine,” says Zachary Doerzaph, director of the Center for Advanced Automotive Research at the Virginia Tech Transportation Institute. The greater challenges come when police and other first responders are outside their vehicles: “It’s all of these other cases where that last 10 percent of development could take the majority of the time.” Doerzaph’s team is researching such scenarios for a group of automakers, but he can’t yet talk about its findings. 

The jargon often used for these atypical moments is “edge cases,” but the term belies the extent of the challenge, Doerzaph says. At any given moment there are thousands of construction zones, crash sites and cops standing in intersections all across the country. The cues humans use to recognize them are subtle and varied. Humans also recognize basic hand signals and, perhaps most important for the police, acknowledge instructions with eye contact or a nod. 

It might prove necessary, as self-driving researchers try to replicate these subtle interactions, to create new modes of communication between cars and police. In theory, when Trooper Monroe gets out of his patrol car on the expressway in Michigan, he could, with a couple of taps on a handheld device, instruct all AVs in the area to steer clear. These kinds of solutions, while technologically elegant, present a host of logistical and legal hurdles.
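
No such channel exists yet, but the idea can be sketched: the officer’s device broadcasts a geofenced alert, and each AV checks whether it falls inside the zone. Everything below is invented for illustration:

```python
import math
from dataclasses import dataclass

@dataclass
class SteerClearAlert:
    """Hypothetical broadcast from an officer's handheld device;
    no such standard exists -- purely illustrative."""
    lat: float
    lon: float
    radius_m: float

def should_steer_clear(alert: SteerClearAlert,
                       av_lat: float, av_lon: float) -> bool:
    """Equirectangular distance check: is the AV inside the alert zone?"""
    m_per_deg = 111_320.0  # meters per degree of latitude (approx.)
    dx = (av_lon - alert.lon) * m_per_deg * math.cos(math.radians(alert.lat))
    dy = (av_lat - alert.lat) * m_per_deg
    return math.hypot(dx, dy) <= alert.radius_m
```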

Inrix Inc., a Washington state-based startup that specializes in digital traffic and parking information, has started offering software to cities that allows them to enter traffic rules and roadway markers into the high-definition maps used by AV developers. City officials can mark the locations of stop signs, crosswalks, bike lanes and so on, and when an AV pings the navigation software to map a route, it will get the rules and restrictions for its trip. Boston, Las Vegas, Austin and four other cities are currently using the service, called AV Road Rules. 

The maps can be updated constantly. If roadwork blocks a lane, a city can mark the change. Inrix is working on letting police update the map instantly from their own cars. “That’s something that we’ve heard that there is interest in, and we are exploring how could we turn that hypothetical functionality into a real tool,” says Avery Ash, head of autonomous mobility at Inrix.
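
Inrix hasn’t published the AV Road Rules interface, but the workflow it describes, cities attaching rules to road segments and AVs fetching them when planning a route, can be sketched with a simple in-memory stand-in (all names invented):

```python
from dataclasses import dataclass, field

@dataclass
class RoadRulesMap:
    """Toy stand-in for a city-maintained rules layer like AV Road Rules."""
    rules: dict[str, list[str]] = field(default_factory=dict)  # segment -> rules

    def mark(self, segment_id: str, rule: str) -> None:
        """City staff -- or, eventually, an officer in a patrol car --
        attach a restriction to a road segment."""
        self.rules.setdefault(segment_id, []).append(rule)

    def rules_for_route(self, segment_ids: list[str]) -> dict[str, list[str]]:
        """What an AV would receive when it pings the service for a route."""
        return {s: self.rules[s] for s in segment_ids if s in self.rules}

# Example: a lane closed by roadwork, then an AV planning through it.
city_map = RoadRulesMap()
city_map.mark("main-st-0412", "right_lane_closed_construction")
print(city_map.rules_for_route(["main-st-0411", "main-st-0412"]))
# {'main-st-0412': ['right_lane_closed_construction']}
```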

Once the AV industry solves the day-to-day traffic stops, accident scenes and roadwork, a long list of true “edge cases” awaits. “What if it’s a terrorist suspect? What if I ordered the car and just throw a backpack in it, and then tell the car to go wherever and then blow it up?” asks Lieutenant Jaeger in Mountain View, who’s been working with Waymo engineers since the company was a self-driving car project inside Google. 

The good news for the industry is that cities, cops and automakers are all motivated to find answers because they all agree that the status quo is unacceptable. More than 37,000 people lose their lives in motor vehicle crashes on U.S. roads every year, and the overwhelming majority of collisions are due to human error. Police are some of the main witnesses to this carnage and sometimes its victims. Cars that could detect their sirens from miles away and reliably follow the rules would be a welcome change.

“The human driver is just not predictable,” says Monroe, the state trooper. “It’s very, very difficult.” 

Schwall at Waymo says that when he holds training sessions with police—showing them how the company’s fleet of vans works and letting them get inside—he often hears the same question: “They ask when they can have a self-driving police car.” 

--With assistance from Dana Hull.

To contact the editors responsible for this story: Aaron Rutkoff at arutkoff@bloomberg.net, Craig Trudell

©2019 Bloomberg L.P.