
Most strokes are “ischemic,” meaning a blood clot blocks blood flow to the brain. Because blood carries precious oxygen, the blockage starves brain tissue, which begins to die within minutes. A 10-minute delay in treatment for a stroke can destroy roughly 19 million neurons – this loss of brain matter is why many stroke victims face facial sagging from partial paralysis or declines in cognitive ability.
On Sunday, March 1, 2026, one of autonomous taxi company Waymo’s vehicles was filmed blocking an ambulance in Downtown Austin following the tragic 6th Street shooting. The vehicle blocked emergency traffic for one to two minutes, and law enforcement told local news that the delay had no impact on treatment.
However, this incident is just one in a line of failures demonstrating Waymo’s inability to operate safely and reliably on city streets. In February, one of the company’s cars was filmed passing school buses that had their stop arms extended.
Combined with the recent revelation that Waymo uses remote workers in the Philippines to “help guide” its driverless cars, these incidents showcase serious problems not just with Waymo but with autonomous vehicles in general, and they raise major questions about liability. Officials confirmed that this particular delay had no impact on treatment, but the fact that these vehicles can malfunction in this way is deeply concerning.
Negligence, Product Liability and Owning Mistakes
In a typical car wreck, the process is straightforward: facts first, fault second, money third. The facts are established from the available evidence, legal fault is determined by reconstructing what happened, and compensation is awarded to the injured party under Texas’ comparative fault system.
With autonomous vehicles, however, the discussion shifts to product liability and “operational negligence.” In other words, no individual human is usually held responsible for what can be the loss of human life; instead, blame is assigned to software decision-making, hardware limitations, fleet monitoring, and similar issues. There is also no single federal liability standard for AV crashes. This contrast matters because it highlights just how difficult it is to determine fault in an AV crash.
OpenAI and Waymo: Two Sides of the Same Coin
OpenAI has been in litigation over its possible role in the suicides of multiple teenagers since August 2025. Its chatbot has also been linked to heightened depression, withdrawal, and a new condition called “AI psychosis.” Just like Waymo, OpenAI released a product into the world without fully understanding the impact it would have. Just like Waymo, OpenAI used the public to test its new product.
However, when the injured seek compensation, the companies involved obscure blame by pointing to traditional product liability cases and skirt responsibility by faulting the end user. Waymo and OpenAI raise many of the same questions:
- What happens if the bot tells me to do something harmful?
- What happens if the car decides to suddenly brake?
- What happens if the bot provides the wrong medical advice, resulting in a worsening injury?
- What happens if the car blocks EMS vehicles, worsening the caller’s injury?
Conclusion
The issue here is not that autonomous vehicles make mistakes. Humans are not perfect drivers, and no product is 100% reliable. The problem is that human lives are put directly at risk when vehicles from Waymo, Tesla, and other Full Self-Driving (FSD) systems are tested on public streets. And when these vehicles cause crashes or delay traffic, including emergency traffic, the response is usually just a slap on the wrist for the company. Human drivers, who never consented, serve as a test group for autonomous vehicles and are regularly injured as a result.
Personal injury cases are already complicated. They become even more complicated when victims must track down the responsible company and then litigate partial fault between hardware limitations and software bugs. Full compensation is also harder to reach in AV cases, because law firms must now hire digital forensics experts, programming experts, and other specialists knowledgeable enough to speak to the failures of autonomous vehicles.
If you have been injured in an autonomous vehicle accident involving Waymo, Tesla, Nuro, Wayve, or any other company offering FSD or AV taxi services, you need to contact an experienced personal injury attorney today. Hilda Sibrian has served clients in Houston, Texas, for over 22 years in auto accidents, semi-truck accidents, and other injuries suffered as a pedestrian or on the road. Hilda Sibrian serves the Houston metropolitan area, including Sugar Land, Missouri City, La Porte, Beaumont, Pasadena, The Woodlands, The Heights, Bellaire, Kingwood, Baytown, and of course Houston proper.
Call our office today or fill out our online contact form for a free consultation.