Family of Tumbler Ridge shooting victim files lawsuit against OpenAI
A family from Tumbler Ridge, British Columbia, Canada, whose daughter was critically injured in a mass shooting on February 10, has filed a civil lawsuit against OpenAI, alleging that the technology company failed to alert authorities about the attacker's ChatGPT activity before the tragedy, according to foreign media reports.
The lawsuit follows the February 10 shooting at Tumbler Ridge Secondary School, where eight people were killed and two others were wounded. Among those injured was 12-year-old Maya Gebala.
According to her parents, Cia Edmonds and David Gebala, Maya was shot while attempting to lock the library door to protect other students after the attacker opened fire inside the school. She was struck in the neck and in the head, just above her left eye.
Authorities say the suspected shooter, Jesse VanRootselaar, had gone to the school after killing her mother, Jennifer Strang, and her 11-year-old half-brother, Emmett Jacobs, at their family home in northeastern British Columbia.
Maya Gebala remains hospitalised at BC Children’s Hospital. In an update on March 7, her mother said doctors had removed her breathing tube and that the girl was continuing to fight through her injuries and was beginning to look more like herself.
The lawsuit comes after OpenAI acknowledged that it did not alert police last summer about the shooter’s activity on ChatGPT.
On February 26, the company said it would strengthen its safety procedures, including improving systems for referring potential threats to law enforcement and identifying repeat policy violators. OpenAI said it had disabled VanRootselaar’s account in June because of “violent” activity.
In a statement, the company said it later discovered a second ChatGPT account linked to VanRootselaar’s name after the shooting, despite safeguards designed to flag repeat offenders.
OpenAI ultimately notified the Royal Canadian Mounted Police about the shooter’s ChatGPT activity only after the mass shooting on February 10.
Last Thursday, David Eby, the Premier of British Columbia, said OpenAI chief executive Sam Altman was “prepared to apologise” to the Tumbler Ridge community.
The lawsuit, filed on behalf of Maya Gebala, her sister Dahlia Gebala and their mother Cia Edmonds, seeks to determine how the shooting occurred, establish accountability and help prevent similar attacks in the future.
The civil claim alleges that roughly a dozen OpenAI employees identified posts made by the shooter on ChatGPT as indicating an imminent risk of serious harm to others.
“Approximately 12 employees of the OpenAI Defendants identified the Gun Violence ChatGPT Posts as indicating an imminent risk of serious harm to others and recommended Canadian law enforcement be informed,” the lawsuit states.
The claim further alleges that the concerns were escalated to company leadership, which instead decided to ban the shooter’s first OpenAI account and “rebuffed their employees’ request to contact Canadian law enforcement.”
The lawsuit also alleges that OpenAI failed to detect and ban the shooter’s second account, which was allegedly used to continue planning violent scenarios, including a potential mass-casualty attack.
In addition, the claim accuses the company of failing to introduce age-verification or parental-consent systems and of “knowingly and intentionally permitting ChatGPT to provide pseudo-psychological treatment to the Shooter.” It further alleges that ChatGPT provided “information, guidance and assistance” that helped the attacker plan a mass-casualty event similar to the Tumbler Ridge shooting.
OpenAI’s revenue in 2025 totalled about $20 billion.
The company has not yet been formally served with the lawsuit and has therefore not responded to the allegations, which remain unproven.
By Tamilla Hasanova