In 2025, several families filed civil lawsuits against technology companies related to the use of artificial intelligence chatbots, including ChatGPT. These lawsuits allege that interactions with AI systems may have contributed to tragic outcomes involving suicide or severe self-harm, particularly during periods of emotional distress.
One widely reported case involved the parents of a teenage user who died by suicide after engaging in prolonged conversations with an AI chatbot. The lawsuit alleges that the technology failed to adequately respond to expressions of vulnerability and distress and did not effectively direct the user to appropriate crisis support resources. The case raised broader legal questions about the responsibilities of technology companies when their products are used by vulnerable individuals.
At this time, these cases consist of allegations only. No court has made findings of liability or fault, and the litigation remains ongoing. Work with a trusted wrongful death attorney from our firm to learn more.
According to publicly available court filings and reporting, plaintiffs in these cases generally allege that the technology failed to respond adequately to expressions of distress and lacked sufficient safeguards for vulnerable users.
These allegations form the basis for claims commonly associated with wrongful death, negligence, and product liability law.
Wrongful death claims typically arise when a death is allegedly caused by the negligence or misconduct of another party. In the context of emerging technology, families may argue that a company owed a duty of care to users and failed to meet reasonable safety standards.
Product liability claims may focus on whether a product was defectively designed, lacked adequate warnings, or failed to include reasonable safeguards. In AI-related litigation, courts may examine whether the technology functioned as intended, whether risks were foreseeable, and whether adequate protections were in place.
Because artificial intelligence is a developing area of law, courts continue to evaluate how traditional legal principles apply to these technologies.
Families who have experienced a loss or serious injury sometimes seek information about whether legal remedies may be available. While every situation is unique, speaking with an attorney is often a helpful first step.
An attorney can help evaluate whether the circumstances meet legal thresholds for a potential claim under applicable law.
Eligibility for legal claims depends on specific facts and jurisdictional law. Factors that may be evaluated include the foreseeability of harm, design choices, the adequacy of warnings, and how the technology was used.
No single factor determines whether a lawsuit may proceed. Courts consider the totality of circumstances.
Is artificial intelligence legally responsible for suicide or self-harm?
Artificial intelligence itself is not a legal person. Lawsuits typically focus on the companies that design, deploy, and maintain the technology. Courts evaluate whether those entities met reasonable safety and design standards.
Are these lawsuits claiming that AI caused the death?
Plaintiffs generally allege that the technology contributed to harm through design flaws or inadequate safeguards. These are allegations, not findings, and must be proven through the legal process.
Do these cases involve minors only?
While some reported cases involve minors, lawsuits may also involve adults. Courts may apply heightened scrutiny when minors or other vulnerable users are involved.
What type of law applies to these cases?
These cases often involve wrongful death, negligence, and product liability principles. Because AI law is evolving, courts may also consider emerging regulatory and policy frameworks.
Has any court ruled against an AI company in these cases yet?
As of this writing, no final rulings have been issued establishing liability. The cases remain in various stages of litigation.
Does using an AI platform automatically create legal liability?
No. Liability depends on specific facts, including foreseeability, design choices, warnings, and how the technology was used.
These lawsuits reflect growing scrutiny of artificial intelligence and its role in sensitive areas involving mental health and vulnerable populations. As AI adoption increases, courts, regulators, and lawmakers continue to examine appropriate safety standards and accountability measures.
This page is provided for informational and educational purposes only. It does not constitute legal advice and does not create an attorney-client relationship. Laws vary by jurisdiction, and legal outcomes depend on the specific facts of each case.