Tumbler Ridge Families Sue OpenAI Over Mass Shooter's ChatGPT Use

Seven families affected by the February mass shooting in Tumbler Ridge, British Columbia, filed lawsuits Wednesday against OpenAI and chief executive Sam Altman in a San Francisco court, alleging that the company failed to alert authorities to the gunman's disturbing ChatGPT history despite warnings from its own automated safety systems. The combined claims are expected to exceed one billion United States dollars, the American lawyer representing some of the families said in a statement.
The lawsuits, the most serious legal challenge yet against an artificial intelligence company over a real-world act of violence, allege that OpenAI's leadership received internal warnings about the shooter's account months before the rampage and chose to deactivate the account rather than notify the Royal Canadian Mounted Police or any other law-enforcement agency. The complaint further alleges that the shooter created a second account, again flagged for concerning content, and OpenAI did not act.
What the lawsuits allege
According to the filings, OpenAI's automated systems flagged the account of Jesse Van Rootselaar in June 2025 for what the company classified internally as gun violence activity and planning. A safety team reviewed the content and recommended that OpenAI management notify the authorities, but according to the complaint the company instead chose to deactivate the account. When Van Rootselaar created a new account and resumed similar conversations, the families allege that OpenAI again failed to take action that could have alerted Canadian or American authorities.
The lawsuits centre on alleged duties of care owed by OpenAI in the design and operation of its consumer products, particularly when those products are used by minors. Van Rootselaar was 18 at the time of the shooting in February but was reportedly using ChatGPT well before her 18th birthday. The case will test whether existing United States product liability and negligence frameworks apply to large language model providers in the same way they apply to manufacturers of physical products.
The shooting
The Tumbler Ridge shooting was among the deadliest in Canadian history. According to police accounts and prosecution records made public after the attack, Van Rootselaar killed her mother and her 11-year-old half-brother at their home before driving to Tumbler Ridge Secondary School, where she killed five students and a teacher with a long gun and a modified handgun. She then took her own life inside the school. The community of roughly 2,400 people in northeastern British Columbia has spent the months since the attack working with provincial trauma counsellors and federal victim services.
The shooting prompted swift federal and provincial responses. British Columbia Premier David Eby announced a multi-agency review of school safety protocols and mental health supports in rural communities. The federal government opened consultations on whether existing firearms regulations adequately address modifications of legally purchased handguns and on whether artificial intelligence-related safeguards should be added to public safety legislation.
The legal strategy
The case is being brought in California rather than in British Columbia for jurisdictional reasons. OpenAI is headquartered in San Francisco, and California courts have produced a body of case law on platform liability that the plaintiffs' lawyers consider more developed than equivalent Canadian jurisprudence. The lawsuits also name Altman personally, alleging that as chief executive he bore individual responsibility for the design choices and corporate policies that allegedly led to the failures described in the complaint.
OpenAI has not yet filed a formal response. In statements after a separate, earlier lawsuit filed by one of the affected families, the company expressed condolences and said it took the safety of its products seriously, while declining to address the specific allegations. The company has not publicly disclosed the contents of any internal safety review related to the case, and is likely to invoke trade secret protections in litigation.
Implications for Canadian regulation
For Canadian policymakers, the case crystallises a debate that has been intensifying since the federal Artificial Intelligence and Data Act stalled in the previous Parliament. The Carney government has indicated that it intends to revisit the file in the current session, with officials suggesting that a new framework will likely include both pre-deployment safety requirements for high-risk AI systems and explicit post-deployment reporting obligations for incidents that involve threats of violence.
Civil society groups including the Canadian Civil Liberties Association and the Centre for Digital Rights have urged careful drafting of any reporting obligations, citing concerns about freedom of expression and the risk that overly broad triggers could capture the kinds of dark thoughts that many people, including those who would never act on them, type into chatbots. Other observers, including parents' groups and victim advocates, have argued that the Tumbler Ridge case demonstrates the limits of relying solely on self-regulation by AI companies.
Reaction from the technology sector
The lawsuits have been met with concern across the artificial intelligence industry, where companies have been racing to expand consumer products faster than internal safety teams can keep up. Several other companies operating large language models have privately acknowledged that the Tumbler Ridge case could become a defining test of how courts and regulators apportion responsibility for AI-mediated harms.
Investors in OpenAI, including major institutional shareholders, have begun pressing the company on its content moderation, age verification, and law-enforcement liaison practices. The company is also facing what one of the families' lawyers described as waves of additional litigation, with other plaintiffs across North America preparing similar claims tied to incidents in which AI products are alleged to have played a contributing role.
What it means for British Columbia
For Tumbler Ridge itself, the lawsuits represent both a search for accountability and an additional layer of public attention that the community has at times resisted. Local officials have stressed that the families' legal action is a private matter and that the town will continue to focus on its own recovery, including through programs for surviving students and through commemorations planned for later this year.
For the broader province, the case adds to an already complicated set of debates about firearms regulation, mental health services in remote communities, and the safety of digital products used by young people. Premier Eby has said that the province will follow the litigation closely and will consider whether additional provincial measures are warranted regardless of the outcome.
Cross-border legal complications
The decision to file in California rather than in British Columbia reflects deliberate strategic choices, but it also creates complications. Canadian plaintiffs pursuing claims in United States courts must navigate jurisdictional questions, evidentiary requirements that differ from Canadian norms, and the possibility that judgments obtained in California will need to be enforced separately if any settlement involves operations on either side of the border.
The case will also have indirect implications for Canadian privacy and consumer protection law. Canadian courts have not yet developed a substantial body of case law on AI-related harms, and the outcome of the California litigation could influence the evolution of Canadian jurisprudence on related questions. Provincial privacy commissioners and consumer protection regulators have been monitoring developments in artificial intelligence governance, with several considering more proactive interventions.
The mental health policy dimension
The Tumbler Ridge case has also reopened questions about access to mental health services in rural and remote communities. Federal and provincial mental health funding has expanded substantially over the past several years, but service availability in northeastern British Columbia and similar regions remains limited. Provincial officials have indicated that the case has accelerated discussions about how to deliver crisis support and ongoing care in communities far from major urban centres.
Local advocates in Tumbler Ridge and similar communities have argued that improved mental health services would address only part of the broader challenge. Firearms availability, family supports, school-based identification programs, and the role of digital products all factor into the complex set of conditions that contribute to incidents of mass violence. The Carney government has signalled openness to a comprehensive review, although the political and policy challenges of any such effort would be substantial.
The impact on AI product design
Beyond the legal proceedings themselves, the Tumbler Ridge case has prompted broader internal discussions at AI companies about product design, safety review processes, and the relationship between automated flagging systems and human review. Several major AI providers have indicated that they are revisiting their internal protocols for handling content that suggests planning of violence, including the question of when and how to engage with law enforcement.
The technical challenges are substantial. AI systems generate vast volumes of content daily, and distinguishing between users discussing dark themes for legitimate reasons, including creative writing, journalism, or personal processing, and users with genuine intent to commit harm requires both sophisticated technical tools and human judgment. The legal liability environment that emerges from cases like Tumbler Ridge will shape how AI companies make these difficult judgments.
What's next
The California court is expected to set an initial schedule in the coming weeks. OpenAI is likely to file motions to dismiss, contesting both the merits of the negligence claims and the appropriateness of California as the venue. If the case survives those preliminary motions, the discovery phase could surface internal documents about OpenAI's safety review processes that have so far been confidential.
Federal officials in Ottawa, meanwhile, are weighing whether to expedite the long-delayed AI safety legislation, with some Liberal members of Parliament suggesting that the Tumbler Ridge case may provide the political momentum required to move past the impasse that derailed the previous bill. Any new framework will need to balance the competing imperatives of innovation, public safety, and individual privacy in ways that satisfy both industry and the families who have been pushed into the role of reluctant policy advocates.
The families themselves have indicated that their pursuit of accountability through the courts is intended to produce changes in how AI products are designed and operated, not solely to secure financial compensation. Several family members have spoken publicly about their hope that the case will lead to safer products and to a stronger framework for protecting other potential victims from similar harms in the future.