Lawsuit alleges ChatGPT failed to alert authorities before Tumbler Ridge shooting

March 11, 2026

The mother of a survivor of the 2026 Tumbler Ridge mass shooting has filed a lawsuit in B.C. Supreme Court against OpenAI and its ChatGPT service, alleging the company failed to notify authorities after internal staff reportedly flagged violent planning discussions from the attacker. The lawsuit claims company moderators identified the chats as posing an imminent risk of serious harm but leadership declined to alert police.

The civil action, filed March 9 on behalf of Cia Edmonds and her daughters Maya and Dahlia Gebala, centres on alleged interactions between the shooter, Jesse Van Rootselaar, and ChatGPT in the summer of 2025. According to the claim, Van Rootselaar, then 17, used the service to discuss multiple scenarios involving gun violence over several days.

The lawsuit alleges that 12 ChatGPT monitoring staff reviewed the conversations and determined they indicated a serious threat to others. Employees reportedly recommended notifying Canadian law enforcement, but the request was rejected by company leadership, according to the filing.

Instead, the account was closed, the lawsuit states. Van Rootselaar allegedly opened another account afterward and continued discussing violent scenarios, including a potential mass casualty event similar to the attack that later occurred.

On Feb. 10, 2026, Van Rootselaar killed her mother and half-brother at their home before going to Tumbler Ridge Secondary School, where she killed five students and a teacher before taking her own life.

Maya Gebala, who was attempting to lock a library door during the attack, was critically injured and remains hospitalized with a catastrophic brain injury and paralysis on her right side, according to the claim.

The lawsuit seeks to determine whether earlier intervention could have prevented the violence. A statement from the family’s legal representatives said the action aims “to learn the whole truth about how and why the Tumbler Ridge mass shooting happened, to impose accountability, to seek redress for the harms and losses, and to prevent another mass shooting atrocity in Canada.” 

OpenAI has 35 days to respond to the lawsuit, which comes as artificial intelligence companies face growing scrutiny from policymakers over how their systems handle violent or threatening user activity. The company’s CEO, Sam Altman, met with federal AI Minister Evan Solomon earlier this month and reportedly agreed to introduce safety changes.

Altman also met with B.C. Premier David Eby and, according to the premier, committed to issuing a public apology to the victims of the attack. As of March 9, no apology had been issued.

Jim Love

Jim is an author and podcast host with over 40 years in technology.
