A chilling incident unfolded in Santa Monica, leaving many concerned about the safety of autonomous vehicles. A Waymo robotaxi collided with a child near an elementary school, prompting a federal investigation.
On January 23rd, Waymo reported the accident to the National Highway Traffic Safety Administration (NHTSA). The company stated that a child, whose details are being kept private, suffered minor injuries. The NHTSA has since opened an investigation, and Waymo, in a blog post, pledged full cooperation.
Waymo's account of the incident states that its robotaxi had braked from 17 mph to 6 mph when the child suddenly emerged from behind an SUV, directly in the vehicle's path. Waymo claims its technology detected the child the moment they appeared. According to the company, the child then stood up and walked to the sidewalk, and emergency services were called. The vehicle remained at the scene until law enforcement arrived.
This incident adds to Waymo's recent troubles: the company already faces dual investigations over its robotaxis illegally passing school buses. The NHTSA and the National Transportation Safety Board are scrutinizing these incidents, with the NHTSA focusing on how cautiously the robotaxis behave near schools and how reliably they detect vulnerable road users.
Interestingly, Waymo's blog post suggests the outcome could have been worse with a human behind the wheel, claiming its internal model indicates a human driver would have struck the pedestrian at a higher speed. However, Waymo has not disclosed the specifics of this crash analysis.
As the debate over autonomous vehicle safety intensifies, incidents like these raise important questions. Are these vehicles truly ready for our roads? How can we ensure the safety of pedestrians, especially children? And what level of accountability should companies like Waymo have? Share your thoughts below, and let's explore these critical aspects of our evolving transportation landscape.