Arizona woman hit by self-driving car: How driverless cars see the world around them

Driverless cars can have trouble in tunnels and on bridges, and may have difficulty dealing with heavy traffic. PHOTO: REUTERS

ARIZONA (NYTIMES) - On Sunday (March 18) night, a woman died after she was hit by a self-driving car operated by Uber in Tempe, Arizona. The car was operating autonomously, although a safety driver was behind the wheel, according to a statement from the local police.

Uber is one of many companies testing this kind of vehicle in Arizona, California and other parts of the country.

Waymo, the self-driving car company owned by Google's parent company, Alphabet, has said it is operating autonomous cars on the outskirts of Phoenix without a safety driver behind the wheel.

On Monday, Uber said it was halting tests in Tempe, Pittsburgh, Toronto and San Francisco.

Here is a brief guide to the way these cars operate.

How do these cars know where they are?

When designing these vehicles, companies like Uber and Waymo begin by building a three-dimensional map of a place.

They equip ordinary automobiles with lidar sensors - "light detection and ranging" devices that measure distances using pulses of light - and as company workers drive these cars on local roads, these expensive devices collect the information needed to build the map.

Once the map is complete, cars can use it to navigate the roads on their own. As they do, they continue to track their surroundings using lidar, and they compare what they see with what the map shows. In this way, the car gains a good idea of where it is in the world.
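In very simplified form, that matching step can be sketched in a few lines of code. Everything below - the two-dimensional "map", the scan and the candidate positions - is invented for illustration; a real car matches dense three-dimensional point clouds, but the idea of trying candidate poses and keeping the one where the scan best overlaps the map is the same:

```python
# A minimal sketch of map-based localisation in a toy 2D world.
# Map, scan and candidate poses are all made up for illustration.
import numpy as np

def score_pose(map_points, scan_points, x, y, heading):
    """Lower score = the transformed scan lines up better with the map."""
    c, s = np.cos(heading), np.sin(heading)
    rot = np.array([[c, -s], [s, c]])
    world = scan_points @ rot.T + np.array([x, y])  # scan -> world frame
    # For each scan point, distance to its nearest map point.
    dists = np.linalg.norm(world[:, None, :] - map_points[None, :, :], axis=2)
    return dists.min(axis=1).sum()

map_points = np.array([[0.0, 5.0], [2.0, 5.0], [4.0, 5.0]])  # e.g. a wall
scan = np.array([[0.0, 3.0], [2.0, 3.0], [4.0, 3.0]])        # lidar returns

# Try a handful of candidate positions; keep the one the scan fits best.
candidates = [(0.0, 0.0), (0.0, 2.0), (1.0, 1.0)]
best = min(candidates, key=lambda p: score_pose(map_points, scan, p[0], p[1], 0.0))
print("best position estimate:", best)  # -> (0.0, 2.0)
```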

Lidar also alerts the cars to nearby objects, including other cars, pedestrians and bicyclists.

Is that the only important technology?

Lidar works pretty well, but it can't do everything. It provides information only about objects that are relatively close, which limits how fast cars can drive.

Its measurements are not always sharp enough to distinguish one object from another. And when multiple autonomous vehicles drive the same road, their lidar signals can interfere with one another.
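A rough calculation shows why sensing range caps speed. The figures below - braking power, reaction delay, a 100m reliable range - are illustrative assumptions, not specifications of any real vehicle:

```python
# Back-of-envelope sketch of why sensor range limits safe speed.
# Deceleration, reaction delay and sensor range are assumed values.
def stopping_distance(speed_ms, decel_ms2=5.0, reaction_s=0.5):
    # Distance covered while reacting, plus distance to brake to a halt.
    return speed_ms * reaction_s + speed_ms**2 / (2 * decel_ms2)

for kmh in (50, 80, 110):
    v = kmh / 3.6  # convert km/h to m/s
    print(f"{kmh} km/h -> needs ~{stopping_distance(v):.0f} m to stop")

# If lidar sees reliably only to ~100 m, any speed whose stopping
# distance exceeds that leaves the car unable to brake for what it detects.
```

At 110kmh the car in this sketch needs roughly 109m to stop, so a sensor that is reliable only to 100m would make that speed unsafe.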

Even in situations where lidar works well, these companies want backup systems in place. So most driverless cars are also equipped with a variety of other sensors.

Like what?

Cameras, radar and global positioning system antennas - the kind of GPS hardware that tells your smartphone where it is.

With the GPS antennas, companies like Uber and Waymo are providing cars with even more information about where they are. With cameras and radar sensors, they can gather additional information about nearby pedestrians, bicyclists, cars and other objects.

Cameras also provide a way to recognise traffic lights, street signs, road markings and other signals that cars need to take into account.
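Blending two position estimates might look like the sketch below. Real cars use full probabilistic filters such as Kalman filters; this simple inverse-variance average, with made-up numbers, only illustrates that a rough GPS fix and a precise lidar-and-map fix can be weighted by how much each is trusted:

```python
# A minimal sketch of combining position estimates from two sensors.
# The variances and positions are invented; real systems run full
# probabilistic filters over many sensors at once.
def fuse(est_a, var_a, est_b, var_b):
    w_a, w_b = 1.0 / var_a, 1.0 / var_b  # trust = inverse of noise
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)

gps_x, gps_var = 105.0, 9.0       # GPS: right to within a few metres
lidar_x, lidar_var = 102.0, 0.04  # lidar vs map: centimetre-level
print(f"fused x estimate: {fuse(gps_x, gps_var, lidar_x, lidar_var):.2f} m")
# The fused answer sits almost on the lidar estimate, because it is
# far more precise - but the GPS still anchors it to the right area.
```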

How do the cars use all that information?

That is the hard part. Sifting through all that data and responding to it require a system of immense complexity.

In some cases, engineers will write specific rules that define how a car should respond in a particular situation. A Waymo car, for example, is programmed to stop if it detects a red light.
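A hand-written rule of that kind can be sketched in a few lines. The state names below are invented for illustration; this shows the style of rule-based control, not Waymo's actual code:

```python
# A toy example of the rule-based style described above.
def plan_action(traffic_light, obstacle_ahead):
    if traffic_light == "red":
        return "stop"      # hard-coded rule: red light means stop
    if obstacle_ahead:
        return "brake"     # hard-coded rule: something in the way
    return "proceed"

print(plan_action("red", obstacle_ahead=False))   # -> stop
print(plan_action("green", obstacle_ahead=True))  # -> brake
```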

But a team of engineers could never write rules for every situation a car could encounter. So companies like Waymo and Uber are beginning to rely on "machine learning" systems that can learn behavior by analyzing vast amounts of data describing the country's roadways.

Waymo now uses a system that learns to identify pedestrians by analysing thousands of photos that contain people walking or running across or near roads.
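The shape of that learning process can be sketched with a toy classifier. The "images" below are synthetic two-number feature vectors, nothing like real photos, and real systems train deep neural networks on millions of examples; the sketch only shows the pattern of learning from labelled data:

```python
# A toy sketch of learning from labelled examples: show the system
# inputs tagged "pedestrian" or "not", adjust weights to fit, then
# predict on new inputs. The data here is synthetic and the model
# (logistic regression) is far simpler than a real perception network.
import numpy as np

rng = np.random.default_rng(0)
# Fake features: "pedestrians" cluster near (1, 1), others near (-1, -1).
X = np.vstack([rng.normal(1, 0.5, (100, 2)), rng.normal(-1, 0.5, (100, 2))])
y = np.array([1] * 100 + [0] * 100)

w, b = np.zeros(2), 0.0
for _ in range(500):  # gradient descent on the log loss
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * (p - y).mean()

# Classify a new, unseen input.
new = np.array([0.9, 1.1])
print("pedestrian?", 1 / (1 + np.exp(-(new @ w + b))) > 0.5)  # -> True
```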

Is that the kind of thing that broke down in Tempe?

It is unclear what happened in Tempe. But these cars are designed so that if one system fails, another will kick in. In all likelihood, the Uber cars used lidar and radar as well as cameras to detect and respond to nearby objects, including pedestrians.
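The redundancy idea can be sketched simply: act if any sensor reports an obstacle, so one failed or confused sensor does not silence the others. The sensor names and structure below are illustrative, not Uber's actual architecture:

```python
# A sketch of sensor redundancy: the car reacts if ANY sensor
# reports an obstacle this frame, so one blind sensor is not fatal.
def obstacle_detected(lidar_hits, radar_hits, camera_hits):
    # Each argument: list of objects that sensor detected this frame.
    return any([lidar_hits, radar_hits, camera_hits])

# A camera misses a pedestrian at night, but radar still returns an echo:
print(obstacle_detected(lidar_hits=[], radar_hits=["object@20m"], camera_hits=[]))
# -> True
```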

Self-driving cars can have difficulty duplicating the subtle, nonverbal communication that goes on between pedestrians and drivers. An autonomous vehicle, after all, can't make eye contact with someone at a crosswalk.

"It is still important to realize how hard these problems are," said Ken Goldberg, a professor at the University of California, Berkeley, who specialises in robotics.

"That is the thing that many don't understand, just because these are things humans do so effortlessly."

The crash occurred at night. Is that a problem?

These cars are designed to work at night, when some sensors can operate just as well as they can in the daytime. Some companies even argue that it is easier for these cars to operate at night.

But there are conditions that these cars are still struggling to master. They do not work as well in heavy precipitation. They can have trouble in tunnels and on bridges. And they may have difficulty dealing with heavy traffic.
