OpenAI Faces Criminal Probe Over ChatGPT's Involvement in Florida Shooting
OpenAI is under criminal investigation in the United States over the potential role of its ChatGPT technology in a mass shooting that killed two people at Florida State University last year.
Florida's Attorney General James Uthmeier announced on Tuesday that his office has been examining the use of the artificial intelligence (AI) chatbot by a suspect who allegedly shot multiple people on the Tallahassee campus.
"Our review has revealed that a criminal investigation is necessary," Uthmeier stated. "ChatGPT offered significant advice to this shooter before he committed such heinous crimes."
An OpenAI spokesperson responded by stating, "ChatGPT is not responsible for this terrible crime."
This appears to be the first criminal inquiry into OpenAI related to the use of ChatGPT by an individual who allegedly went on to commit a crime.
The spokesperson further noted that OpenAI has cooperated with law enforcement and has "proactively shared" information concerning "a ChatGPT account believed to be associated with the suspect."
OpenAI, co-founded by Sam Altman, rapidly became one of the most recognized names in the technology sector following the launch of ChatGPT in 2022, which has since become one of the most widely utilized AI tools globally.
Details on the Suspect's Interaction with ChatGPT
The suspect, 20-year-old Florida State University student Phoenix Ikner, who is currently incarcerated awaiting trial, reportedly interacted with ChatGPT prior to the shooting. According to OpenAI's spokesperson, the chatbot "did not encourage or promote illegal or harmful activity."
"In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet."
However, Attorney General Uthmeier asserted that ChatGPT "advised the shooter on what type of gun to use" and provided guidance on types of ammunition.
Uthmeier also stated that ChatGPT offered advice regarding "what time of day… and where on campus the shooter could encounter a higher population."
"My prosecutors have looked at this, and they told me that if it was a person on the other end of that screen, we would be charging them with murder," Uthmeier said.
He further explained that under Florida law, anyone who "aids, abets or counsels someone" in the commission or attempted commission of a crime is considered a "principal" in that crime.
Although ChatGPT is not recognized as a person, Uthmeier emphasized that his office must determine the "criminal culpability" of the company behind the chatbot, OpenAI.
Additional Legal Challenges and Safety Concerns
OpenAI is also facing a lawsuit over another incident in which its chatbot may have played a role.
Earlier this year, an 18-year-old individual carried out a shooting in British Columbia, killing nine people and injuring approximately two dozen others.
Following that attack, OpenAI stated it had identified and banned the shooter's ChatGPT account based on usage patterns but did not notify law enforcement. The company has said it intends to strengthen its safety protocols.
The parents of a young girl injured in the British Columbia attack have filed a lawsuit against OpenAI.
Last year, a coalition of 42 state attorneys general sent a letter to 13 technology companies with AI chatbots, including OpenAI, Google, Meta, and Anthropic.
The letter outlined their concerns over an increase in AI usage by individuals "who may not realize the dangers they can encounter" and called for "robust safety testing, recall procedures, and clear warnings to consumers."
The letter also referenced a growing number of "tragedies all across the country," including murders and suicides that apparently involved some use of AI.