The family of a 12-year-old girl critically injured in the Tumbler Ridge school shooting has filed a civil lawsuit against OpenAI, seeking to hold the company accountable. The claim, filed in the British Columbia Supreme Court, alleges that OpenAI failed to alert authorities to the shooter’s violent intentions despite having flagged concerning interactions with its ChatGPT chatbot months before the tragedy.
Details of the Incident
On February 10, 2023, Tumbler Ridge was shaken by a horrific shooting that left eight people dead, including six children. The shooter, Jesse Van Rootselaar, subsequently took her own life. The claim states that, during interactions with ChatGPT in June 2022, the shooter described violent scenarios, which OpenAI’s automated review system reportedly flagged. The lawsuit asserts that staff members debated whether to notify law enforcement but ultimately took no action.
Cia Edmonds, representing her daughters Maya and Dahlia Gebala, has launched this legal action to uncover the full circumstances surrounding the shooting. “The purpose of this lawsuit is to learn the whole truth about how and why the Tumbler Ridge mass shooting happened, to impose accountability, to seek redress for harms and losses, and to help prevent another mass-shooting atrocity in Canada,” her legal team, Rice Parsons Leoni & Elliott LLP, stated.
The Impact on Victims
Maya Gebala was shot three times and sustained life-threatening injuries, including a traumatic brain injury that has left her with physical and cognitive disabilities. The civil claim details her ongoing struggles, including right-sided hemiplegia, scarring, and mental health challenges such as depression and PTSD. She remains in BC Children’s Hospital, and her long-term outlook is uncertain.

While Dahlia was not physically harmed, the psychological aftermath has been severe, leaving her with PTSD, anxiety, and sleep disturbances. Their mother, Cia Edmonds, suffers similar mental health issues, alongside the emotional toll and loss of income resulting from the tragedy. The claim reflects the profound and enduring impact on the family’s lives.
OpenAI’s Response and Future Implications
OpenAI has not publicly responded to requests for comment on the lawsuit. Following the incident, however, the company announced changes aimed at improving its response protocols should similar interactions occur in the future. British Columbia Premier David Eby has indicated that OpenAI’s CEO, Sam Altman, is prepared to apologise to the families affected by the shooting.
The lawsuit criticises OpenAI for allegedly rushing its large language model to market without sufficient safety evaluations, claiming the company was aware of the product’s “hazardous defects.” The plaintiffs seek punitive damages, asserting that OpenAI’s conduct was morally indefensible and caused significant harm to the victims and the broader community.
Why it Matters
This legal action raises critical questions about the responsibilities of technology companies in safeguarding public safety. As society increasingly integrates artificial intelligence into daily life, the case may set significant precedents regarding accountability and the ethical obligations of tech firms when their products are linked to real-world violence. It underscores the urgent need for comprehensive regulations and safety protocols in the AI sector to prevent similar tragedies in the future.
