The family of a 12-year-old girl who sustained life-altering injuries in the recent Tumbler Ridge school shooting has initiated legal action against OpenAI. Filed in the British Columbia Supreme Court on Monday, the civil claim seeks accountability from the tech giant, alleging it failed to alert authorities about the shooter’s alarming intentions despite having prior knowledge.
Legal Action Stemming from Tragedy
Cia Edmonds, representing her daughters, Maya and Dahlia Gebala, contends that OpenAI was well aware of the shooter’s violent intentions, which he communicated in interactions with its ChatGPT chatbot. The claim alleges that in the months preceding the February 10 incident, OpenAI had identified concerning exchanges but did not escalate the matter to law enforcement.
The legal team, Rice Parsons Leoni & Elliott LLP, released a statement emphasising the objectives of the lawsuit: “The purpose of this lawsuit is to learn the whole truth about how and why the Tumbler Ridge mass shooting happened, to impose accountability, to seek redress for harms and losses, and to help prevent another mass-shooting atrocity in Canada.”
The Impact on Victims
Maya suffered severe injuries in the attack. She was shot three times, with one bullet entering her head above her left eye. The civil suit details the catastrophic effects of her injuries, including a traumatic brain injury and ongoing physical and psychological disabilities. She is currently receiving treatment at BC Children’s Hospital, and her long-term prognosis remains uncertain.

Dahlia, although physically unharmed, has been deeply affected, experiencing PTSD, anxiety, and depression as a result of the trauma. Cia Edmonds has suffered a similar psychological toll, along with loss of enjoyment of life and financial hardship.
OpenAI’s Responses and Controversies
In the wake of the shooting, OpenAI has faced intense scrutiny. Reports indicate that employees had engaged in discussions about the shooter’s interactions with ChatGPT, with some advocating for notifying Canadian authorities. However, the company ultimately opted not to take action at that time.
Since the incident, OpenAI has announced modifications to its systems, claiming that future alarming interactions would now be flagged for law enforcement intervention. British Columbia Premier David Eby revealed that OpenAI’s CEO, Sam Altman, is willing to express regret to the families affected by the tragedy.
The civil claim also faults OpenAI for rushing its technology to market without adequate safety assessments, alleging it knowingly released a product with “hazardous defects.” The plaintiffs are seeking punitive damages, asserting that the company’s conduct was not merely legally questionable but morally indefensible.
Why it Matters
This lawsuit raises critical questions about the responsibility of tech companies to monitor and manage the potential misuse of their products. As communities grapple with the aftermath of the Tumbler Ridge shooting, the case could set significant legal precedent on accountability in the tech industry. It underscores the need for robust safety measures and ethical consideration in the development and deployment of AI systems, particularly those that interact directly with the public. The outcome may shape future legislation and corporate practice around technology and public safety.
