Autonomous cars require human trust

In Robot Cars We Trust, or Do We?

One hundred years ago, society worried about what would happen when the traffic light failed and the electricity went down. One hundred years later, we are asking programmers of autonomous vehicles the same question—only now, the answer involves gigabytes of data and millisecond-latency pings.

To understand this, we take a quick look back at the history of the traffic “robot.”

The traffic light is more than a hundred years old, and since 1917 South Africans have called them “robots.” That’s because the public was suspicious of the devices that replaced the human traffic officers who stood at intersections wearing white gloves for visibility. Today we meet a new “robot” in transportation: the self-driving autonomous car. There’s a link between these two robots: the technology only works when the public trusts it.

More than a century after South Africa embraced its “robots,” a new robot tale unfolded in San Francisco. On December 20, 2025, a fire broke out at an electrical substation. It plunged roughly one-third of the city (about 130,000 customers) into darkness. It also plunged autonomous cars into “brick-hood.” As the grid failed, the city’s traffic signals stopped working, and so did the autonomous cars, these modern “robots.”

Congestion on the cellular networks kept the vehicles from reaching their human handlers. So, instead of smoothly improvising, the driverless cars defaulted to their hyper-conservative safety mode, coming to a dead stop in the middle of San Francisco intersections. It was better than stopping on a railroad track with a train approaching. No one was hurt, but trust was tarnished.

For a few days after the stoppage, public officials spoke of slowing down or further regulating the autonomous vehicles. Trips stopped for about a day. Not surprisingly, the fervor blew over, and the autonomous service was soon up and running again. Riders never really lost their trust in the modern robot.

In 1917 it was harder. Engineers had to convince a skeptical public that a brand-new, untested series of colored lights could do a better job than a traffic officer’s hand signals.

Trust is at the foundation of “robotic” change. Whether you are traveling through the red and green signals we take for granted today or driving into the future in an autonomous vehicle, both require a level of faith in the integrity of the electric grid, the programming, and the integration of sensors, cameras, and algorithms.

And like the robotic traffic signals, autonomous cars are safer than humans at intersections. Robotic cars do not (typically) run a yellow or forget to check for pedestrians on the right. They don’t creep over the stop line. According to a now well-known media story written by an emergency room surgeon, autonomous cars produce 96% fewer injury-causing crashes at intersections.

Significantly, robotic cars “read” traffic signals differently than human drivers do. We consult Waymo’s postings to elucidate how these vehicles are programmed to handle the red, yellow, and green.

Robot Waymo, Meet Robot Signal:

  • 360-Degree Awareness: Cameras see traffic light colors, while LiDAR paints a 3D picture of the surroundings to confirm the environment.
  • Contextual Understanding: Autonomous vehicles do not just see a green light; they interpret it within the context of the scene, identifying nearby pedestrians, cyclists, and other cars.
  • Right on Red: Waymo is programmed to identify when a right turn on red is permissible, coming to a complete stop first, then “nudging” forward to gain a better view of traffic before completing the turn.
  • Handling Broken Signals: If a light is out or flashing, Waymo is designed to interpret human gestures, such as a traffic officer waving cars through an intersection, according to a 2019 article in CNET.

The latter issue, anticipating broken signals, is at the core; it is the bridge between the South African robot of 1917 and the Waymo car of 2026. We learned on December 20th in San Francisco that communications fail, that there are not enough human traffic officers on the streets, and that the vehicles are not always listening or watching for that gesture. (Presumably, unlike an unruly 5-year-old, they have not yet learned to disregard human commands!) It remains to be seen how the robot vehicles will respond when they encounter the next electrical outage. It will happen. But between now and then, they will steer clear of many problems and save lives at intersection crossings. Trust is not yet a green light, but it gets closer.

