Pennsylvania is suing Character AI, alleging one of its chatbots falsely posed as a licensed psychiatrist and provided medical advice without proper credentials.
The lawsuit says a chatbot falsely claimed to be a licensed psychiatrist in Pennsylvania and gave an invalid license number. State officials accused the company of violating Pennsylvania’s Medical Practice Act, which governs the medical profession and licensing requirements.
“We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional,” Gov. Josh Shapiro said in a statement.
The case centers on an exchange described in the lawsuit between a state investigator and a Character AI chatbot named “Emilie.” After the investigator created an account, the chatbot allegedly described itself as a psychology specialist who had attended Imperial College London’s medical school.
When the investigator said he had been feeling sad and empty, the chatbot allegedly raised depression and asked whether he wanted to book an assessment. Asked whether it could assess if medication might help, the chatbot allegedly said it could because that was “within my remit as a Doctor,” according to the lawsuit.
The state is asking a court to immediately stop the alleged conduct. Al Schmidt, Pennsylvania’s secretary of the Department of State, said state law is clear: “you cannot hold yourself out as a licensed medical professional without proper credentials.”
Character AI said it would not comment on pending litigation. In a statement, the company said it uses “robust disclaimers” telling users not to rely on characters for professional advice and that user-created characters are fictional, intended for entertainment and roleplaying.
The lawsuit adds to broader scrutiny of Character AI, a platform founded in 2021 that lets users interact with personalized AI-powered chatbots. Multiple families sued the company last year, alleging the platform contributed to teens’ suicides or mental health crises; the company agreed to settle several of those lawsuits earlier this year. Character AI also announced safety measures last fall, including restrictions on back-and-forth chatbot conversations for users under 18 and directing distressed users to mental health resources.
The Pennsylvania case now turns on whether a court finds the alleged chatbot conduct crossed the line from fictional roleplay into unlawfully representing medical credentials or offering medical advice.