The family of a 12-year-old girl critically injured in the Tumbler Ridge school shooting has filed a civil lawsuit against OpenAI. The claim, lodged in the British Columbia Supreme Court by Cia Edmonds on behalf of her daughters, Maya and Dahlia Gebala, alleges that the tech giant knew of the shooter’s violent intentions yet failed to alert law enforcement.
Allegations of Negligence and Accountability
The civil claim alleges that OpenAI was aware of concerning interactions between the shooter and its ChatGPT chatbot months before the incident on February 10. Reports indicate that these interactions were flagged internally but never reported to the authorities. The lawsuit seeks to uncover the truth behind the shooting, to hold OpenAI accountable, and to help prevent future tragedies.
“The purpose of this lawsuit is to learn the whole truth about how and why the Tumbler Ridge mass shooting happened, to impose accountability, to seek redress for harms and losses, and to help prevent another mass-shooting atrocity in Canada,” stated the family’s legal representatives at Rice Parsons Leoni & Elliott LLP.
The Impact on the Victims
Maya Gebala, who was shot three times, suffered devastating injuries, including a traumatic brain injury that has left her with permanent disabilities. The civil claim details right-sided hemiplegia as well as significant psychological harm, including PTSD and depression. She remains in critical condition at BC Children’s Hospital, her future uncertain.

Dahlia, who was present during the shooting but not physically harmed, has also experienced severe psychological repercussions, including anxiety and sleep disturbances. Their mother, Cia Edmonds, has reported similar mental health struggles, which have significantly impaired her quality of life and ability to work.
OpenAI’s Response and Changes Made
OpenAI has faced mounting criticism since the revelations emerged. Reports suggest the shooter used ChatGPT in June 2022 to discuss violent scenarios, raising alarms among some employees, who debated notifying law enforcement. Ultimately, the company did not alert authorities to the potential threat.
In light of the tragedy, OpenAI has stated that it is implementing measures to ensure that similar alarming interactions are flagged for law enforcement in the future. British Columbia Premier David Eby has indicated that OpenAI CEO Sam Altman is prepared to apologise to the families affected by the shooting.
The lawsuit also contends that OpenAI hastily released its language model to the public without sufficient safety assessments, thus exposing the community to potential harm. The plaintiffs are pursuing undisclosed punitive damages, describing OpenAI’s actions as “reprehensible and morally repugnant.”
Why It Matters
This lawsuit underscores the urgent need for accountability in the tech industry, particularly concerning the deployment of artificial intelligence technologies. As the world grapples with the implications of AI, the Tumbler Ridge tragedy raises critical questions about corporate responsibility and the ethical obligations of tech companies to ensure public safety. The outcome of this case could set a precedent, influencing how AI firms manage the risks associated with their products and their role in preventing violence in society.
