
Josh Shapiro has sued Character.AI after one of its chatbots posed as a licensed Pennsylvania psychiatrist and offered medical advice to a state investigator.
Summary
- A Character.AI bot named “Emilie” claimed to be a licensed Pennsylvania psychiatrist and provided a fake state medical license number during a state investigation.
- The bot offered depression assessments and told an investigator its consultations were “within my remit as a Doctor.”
- Pennsylvania is seeking a preliminary injunction to bar Character.AI bots from practicing medicine without a license.
Pennsylvania Governor Josh Shapiro sued Character.AI on May 6, targeting the company’s chatbots for allegedly practicing medicine without a license.
The state said an investigation found that chatbots presenting themselves as fictional characters had claimed to be licensed medical professionals, including psychiatrists, available to discuss mental health symptoms with users.
A Character.AI bot named “Emilie” told a state investigator that assessing whether medication could help was “within my remit as a Doctor.”
The bot also claimed a Pennsylvania medical license and supplied an invalid license number. As of April 17, 2026, Emilie had logged approximately 45,500 user interactions on the platform.
What the state is demanding
The Shapiro administration is seeking a preliminary injunction and a court order to stop AI companion bots from posing as licensed professionals and providing medical advice. The case marks the first enforcement action of its kind announced by a governor in the United States.
“Pennsylvania law is clear,” Al Schmidt, secretary of Pennsylvania’s Department of State, said in a statement. “You cannot hold yourself out as a licensed medical professional without proper credentials.”
Character.AI said its characters are fictional and intended for entertainment, with prominent disclaimers in every chat reminding users that a Character is not a real person. The company said it does not comment on pending litigation.
A pattern of harm allegations
Character.AI has faced a string of lawsuits over harms allegedly linked to its chatbots. Kentucky filed suit in 2026 alleging the company's bots preyed on children and led them into self-harm. A Florida family settled a case against Character.AI and Google after their teenage son died by suicide; the lawsuit alleged the bots engaged in abusive and sexual interactions with the teen.
Governor Shapiro’s 2026-27 proposed budget calls on Pennsylvania’s legislature to require age verification for AI companion bots, mandate detection of self-harm mentions by minors, require reminders that no human is on the other side of the screen, and prohibit sexually explicit or violent content involving children.
As crypto.news reported, AI companies face growing regulatory pressure across multiple fronts in 2026, from cybersecurity liability to consumer protection enforcement.

