The family of a 12-year-old girl critically injured during a school shooting in Tumbler Ridge, British Columbia, has launched a civil lawsuit against OpenAI. The claim, submitted to the B.C. Supreme Court, seeks accountability from the technology company, alleging it had prior knowledge of the shooter’s violent intentions yet failed to alert law enforcement.
Details of the Lawsuit
The legal action was initiated on Monday by Cia Edmonds, representing herself and her daughters, Maya and Dahlia Gebala. The claim contends that OpenAI was aware of troubling interactions between the shooter and its ChatGPT chatbot months before the tragic incident on February 10. Reports indicate that these interactions raised alarms within the company but were not reported to appropriate authorities.
“The purpose of this lawsuit is to learn the whole truth about how and why the Tumbler Ridge mass shooting happened, to impose accountability, to seek redress for harms and losses, and to help prevent another mass-shooting atrocity in Canada,” stated the law firm Rice Parsons Leoni & Elliott LLP, which is acting on behalf of the family.
Impact on the Victims
Maya was shot three times at close range, resulting in severe injuries, including a traumatic brain injury and permanent disabilities. According to the claim, one bullet penetrated her skull, while others caused significant physical harm and mental health issues, including depression and post-traumatic stress disorder (PTSD). She remains hospitalised at BC Children’s Hospital, with her recovery uncertain.

In contrast, Dahlia, who was present during the shooting but physically unharmed, has been left with PTSD, anxiety, and sleep disturbances. Their mother, Cia Edmonds, reports similar psychological distress, along with ongoing pain and a diminished quality of life.
OpenAI’s Response and Company Changes
OpenAI has yet to respond to requests for comment regarding the lawsuit. Reports suggest that, following the shooting, the company modified its systems so that concerning interactions are now escalated to law enforcement.
B.C. Premier David Eby, who has met with OpenAI CEO Sam Altman, revealed that Altman is willing to apologise to the families affected by the tragedy. The civil claim accuses OpenAI of hastily releasing its large language model to the public without sufficient safety assessments, highlighting what the plaintiffs describe as “hazardous defects.”
Seeking Justice and Accountability
The family is pursuing unspecified punitive damages, asserting that OpenAI’s actions, or lack thereof, are morally objectionable and damaging to the community at large. The allegations have yet to be tested in court, and the outcome may set a significant precedent at the intersection of technology and public safety.

Why it Matters
This lawsuit raises critical questions about the responsibilities of tech companies in safeguarding public safety. With the increasing integration of AI technologies into everyday life, accountability mechanisms must be established to prevent future tragedies. The outcome of this case could influence regulations surrounding AI usage and the ethical obligations of companies to monitor and report concerning behaviour, ultimately shaping the future landscape of technology and its impact on society.