
Robotaxi under scrutiny after child struck near California school

Published February 2, 2026

 

US auto safety regulators are opening a fresh investigation after a Waymo self-driving vehicle hit a child near an elementary school in Santa Monica, adding to mounting questions about how robotaxis behave in complex, high-risk environments.

The United States National Highway Traffic Safety Administration (NHTSA) said on Thursday that the incident took place on January 23 during routine school drop-off hours. According to the agency, the child ran into the street from behind a double-parked SUV and was struck by the autonomous vehicle operated by Waymo. The area was crowded, with other children present, a crossing guard on duty and several vehicles stopped illegally nearby.

The crash comes at a moment when self-driving cars are expanding rapidly across US cities. It also lands just days before a previously scheduled February 4 hearing of the US Senate Commerce Committee on autonomous vehicles, where Waymo’s chief safety officer, Mauricio Pena, is set to testify. The National Transportation Safety Board (NTSB) said it will also investigate the Santa Monica incident.

In a blog post published on Thursday, Waymo said it would cooperate fully with regulators. The company said the child “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path”. According to Waymo, the vehicle detected the child immediately after they emerged from behind the stopped car and braked sharply, slowing from about 17 miles per hour (27km/h) to under 6mph (10km/h) before making contact.

NHTSA said it is launching a preliminary evaluation to assess whether the autonomous vehicle exercised appropriate caution given its proximity to a school, the time of day and the presence of young pedestrians and other vulnerable road users.

The agency said it will examine the vehicle’s “intended behaviour in school zones and neighbouring areas, especially during normal school pick-up and drop-off times, including but not limited to its adherence to posted speed limits”. Regulators will also “investigate Waymo’s post-impact response”.

Waymo said its internal modelling suggested that a fully attentive human driver in the same situation would have hit the pedestrian at roughly 14mph (23km/h). After the collision, the company said, the child stood up and walked to the pavement, and emergency services were called.

“The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene,” Waymo said.

The Santa Monica case is not an isolated concern. On the same day, the NTSB opened a separate investigation after Waymo robotaxis in Austin, Texas, were found to have illegally driven past stopped school buses at least 19 times since the start of the school year. In December, Waymo recalled more than 3,000 vehicles to update software linked to that problem, after regulators warned the behaviour increased the risk of crashes.

Waymo has said there were no collisions in the Austin incidents. However, the Austin Independent School District reported that five violations occurred in November after software updates were rolled out. The district asked Waymo to suspend operations near schools during pick-up and drop-off times, but in December told Reuters that the company had refused.

More broadly, Waymo has faced scrutiny over other incidents as well. In late December, one of its vehicles crushed a cat in San Francisco, followed by a similar incident involving a dog about a month later.

 

Wyoming Star Staff
