Uber Car's Failure to Brake Puts Focus on Sensor

Experts reviewing video are surprised vehicle didn't seem to detect victim

TEMPE, Ariz.—The roads north of Arizona State University are in many ways ideal for testing self-driving cars, with wide, clearly marked lanes and minimal traffic late at night, when the vehicle's laser sensors work best.
 The optimal conditions make it especially troubling that an Uber Technologies Inc. self-driving car plowed straight into and killed a pedestrian walking across a street here at night, without appearing to brake or veer, according to a video from the vehicle released by police Wednesday.
 The incident raises questions about whether the sensors that serve as a self-driving vehicle's eyes are ready for the complexities of city life.
 Several autonomous-vehicle experts who reviewed the 21-second video expressed surprise that the car never seemed to detect the 49-year-old woman as she pushed a bicycle across multiple lanes of traffic. They say it is clear the system failed, though some pointed to hazards that can confuse a robot's brain and eyes, such as shrubs that encroach on the roadway or the woman's bike with bags on it.
 Uber called the video “disturbing and heartbreaking” and said it has suspended testing while cooperating with investigators. Some experts said the accident might have been more understandable had a person been driving the car. The pedestrian, Elaine Herzberg, was crossing the street in the dark a few hundred feet from a crosswalk, and she seems to come out of nowhere in the video.
 But an autonomous car is designed to “see” objects in the dark from hundreds of feet away. The promise of autonomous-car technology is that robot eyes are supposed to detect things humans can't, and anticipate and react faster to reduce accidents like this one.
 “She absolutely should have been detected by their system,” Missy Cummings, a professor of mechanical engineering and materials science at Duke University, said Wednesday by email.
 Making matters worse, the human safety operator in the car appeared to be distracted, according to the video, and was looking down for five seconds just before impact.
 An analysis of the accident's circumstances points to possible shortcomings of the vision technology that dozens of auto makers and tech giants have hailed as a fundamental part of bringing driverless cars to the road.
 The collision occurred as the car was headed northbound on North Mill Avenue, which rapidly widens from two lanes to five as it approaches a vast intersection.
 Police on Tuesday wouldn't say precisely where the incident occurred, but the video suggests it happened as the avenue takes a slight curve and then opens into four lanes.
 The video confirms the car is in the right lane, possibly giving it a clear view of the other lanes where the pedestrian would have been coming from.
 A median with shrubs lines the left side of the road, while the far right lane abuts a bike path and a wide sidewalk lined with some shrubs.
 On Monday at around 10 p.m., the same time as the crash the night before, the area was quiet with little traffic, though several pedestrians were walking around, including some pushing bicycles with bags.
 While police said the speed limit on the stretch where the crash occurred is 35 miles per hour, a nearby sign states 45 mph. The Uber vehicle is believed to have been traveling at about 40 mph, the police said.
 Most companies developing autonomous cars, including Uber, use cameras and laser and radar sensors to gather data for the onboard artificial intelligence system to translate. That information then is used to predict what might happen and how the car should respond.
 Because it was night, Uber's system was likely relying heavily on laser sensors, called lidar, which emit beams that bounce off objects to paint a picture of the world, outside experts say. The bulky box of sensors, mounted on the car's roof, can see in 360 degrees and detect objects hundreds of feet away.
 But lidar sometimes struggles to create a complete picture, especially if only a few laser points are bouncing off an object. Snowflakes, for example, can appear to be menacing solid obstacles. It works best at night, when there is no sun interference.
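The sparse-return problem described above can be illustrated with a toy sketch (hypothetical, and not Uber's actual perception code): a detector that clusters nearby lidar returns and discards clusters with too few points, showing how a thinly sampled object can be filtered out as noise. All thresholds and names here are illustrative assumptions.

```python
# Toy illustration of sparse lidar returns (hypothetical; not Uber's code).
# Points are (x, y) ground-plane coordinates in meters. A greedy
# distance-based clustering groups returns; clusters with fewer than
# MIN_POINTS returns are discarded as probable noise -- which is how a
# thinly sampled object can fail to register as an obstacle.

MIN_POINTS = 5        # minimum returns for a cluster to count as an object
CLUSTER_RADIUS = 1.0  # meters: a point this close to a cluster joins it

def cluster_points(points):
    """Greedy single-linkage clustering of 2-D lidar returns."""
    clusters = []
    for p in points:
        placed = False
        for c in clusters:
            if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                   <= CLUSTER_RADIUS ** 2 for q in c):
                c.append(p)
                placed = True
                break
        if not placed:
            clusters.append([p])
    return clusters

def detect_obstacles(points):
    """Return only the clusters dense enough to be treated as real objects."""
    return [c for c in cluster_points(points) if len(c) >= MIN_POINTS]

# A densely sampled object (e.g. a car broadside) produces many returns...
dense = [(10.0, 0.1 * i) for i in range(8)]
# ...while a thin or partially occluded object may yield only a few.
sparse = [(20.0, 5.0), (20.1, 5.2)]

obstacles = detect_obstacles(dense + sparse)
print(len(obstacles))  # prints 1: only the dense cluster survives
```

The sparse two-point cluster is thrown away by the same rule that protects the system from treating snowflakes or sensor noise as solid obstacles, which is the trade-off the experts describe.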
 Some driverless-car experts say Uber's system could have been confused by the shrub alongside the roadway and may not have discerned the difference between shrub branches and a pedestrian.


Eyes on the Road
[Graphic] Autonomous cars use an array of cameras, radar and lidar sensors to guide them. Intersection of North Mill Avenue and Washington Street in Tempe, Ariz.
Sources: Srikanth Saripalli, Texas A&M University (sensor details); Uber (sensor locations); WSJ analysis of Google Maps (crash scene)


BY TIM HIGGINS
