
Shapiro admin alleges company’s AI chatbots illegally pose as doctors

by Associated Press

[Photo: Gov. Josh Shapiro stands at a lectern. Commonwealth Media Services]

HARRISBURG — Pennsylvania has sued an artificial intelligence chatbot maker, saying its chatbots illegally hold themselves out as doctors and deceive users into thinking they are getting medical advice from a licensed professional.

The lawsuit, filed Friday, asks the statewide Commonwealth Court to order Character Technologies Inc., the company behind Character.AI, to stop its chatbots “from engaging in the unlawful practice of medicine and surgery.”

The lawsuit said an investigator from the state agency that licenses professionals created an account on Character.AI, searched on the word “psychiatry” and found a large number of characters, including one described as a “doctor of psychiatry.”

That character held itself out as able to assess the investigator “as a doctor” who is licensed in Pennsylvania, the lawsuit said.


“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” Gov. Josh Shapiro said in a statement. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

Character Technologies did not respond to an inquiry Monday.

The company has faced several lawsuits over child safety. In January, Google and Character Technologies agreed to settle a lawsuit from a Florida mother who alleged a chatbot pushed her teenage son to kill himself. Last fall, Character.AI banned minors from using its chatbots amid growing concerns about the effects of artificial intelligence conversations on children.