In a significant legal move following the tragic Tumbler Ridge shooting, the family of a 12-year-old girl who sustained life-altering injuries has initiated a civil lawsuit against OpenAI. Filed on Monday in the Supreme Court of British Columbia by Cia Edmonds on behalf of her daughters, Maya and Dahlia Gebala, the claim raises serious questions about the tech giant’s responsibility for failing to alert authorities to potential threats indicated by interactions with its ChatGPT chatbot.
Claims of Foreknowledge and Negligence
The lawsuit asserts that OpenAI was aware of the shooter’s violent inclinations prior to the February 10 incident yet failed to notify law enforcement. Citing various media reports and public statements, the claim argues that OpenAI had previously identified alarming exchanges between the shooter and its chatbot but did not report them until it was too late.
“The purpose of this lawsuit is to learn the whole truth about how and why the Tumbler Ridge mass shooting happened, to impose accountability, to seek redress for harms and losses, and to help prevent another mass-shooting atrocity in Canada,” remarked the law firm Rice Parsons Leoni & Elliott LLP, representing the family.
Maya Gebala was critically injured in the shooting, suffering three gunshot wounds, including one that penetrated her skull. The ramifications of her injuries are severe, encompassing a catastrophic traumatic brain injury, permanent disabilities, and enduring psychological conditions such as PTSD. She is currently receiving treatment at BC Children’s Hospital, with her long-term recovery uncertain.
The Impact on Family Members
While Maya bore the brunt of the physical injuries, her sister Dahlia was present during the shooting and has since been grappling with PTSD, anxiety, and depression. Their mother, Cia Edmonds, has reported similar mental health struggles, experiencing pain and a diminished quality of life as a result of the traumatic events.

The civil claim highlights the broader implications of this incident, not only for the victims but also for their families. The psychological scars left by such violence extend well beyond the immediate physical injuries, affecting the entire family unit.
OpenAI’s Response and Broader Implications
OpenAI has yet to respond to inquiries regarding the lawsuit. However, following the incident, the company has indicated it is taking steps to enhance its safety protocols. Reports suggest that an internal review revealed that while employees raised concerns about the shooter’s discussions of violence, no action was taken to alert Canadian authorities at the time.
B.C. Premier David Eby has met with OpenAI CEO Sam Altman, who is reportedly willing to express regret to the affected families. The lawsuit further accuses OpenAI of hastily releasing its AI technology to a global audience without conducting adequate safety assessments, potentially resulting in “hazardous defects.”
The plaintiffs are seeking punitive damages, asserting that OpenAI’s actions are not only morally reprehensible but also pose significant risks to the community at large. The allegations presented in the civil claim remain untested in court, leaving the outcome uncertain.
Why It Matters
This lawsuit underscores a critical intersection between technology and public safety, raising profound questions about accountability in the age of artificial intelligence. As society grapples with the implications of rapidly evolving technologies, the outcome of this case may set important precedents regarding the responsibilities of tech companies in preventing violence. It highlights the urgent need for regulatory frameworks to ensure that innovations do not come at the cost of human safety and well-being.
