Consumer Advocates Demand Investigation into Blatant Deception by AI Companies Offering Therapy Bot Characters

WASHINGTON, D.C. — A broad coalition of consumer protection, digital rights, labor, disability, and democracy advocacy organizations led by CFA filed a formal request for investigation yesterday afternoon, calling on state and federal regulators to investigate and enforce their laws against AI companies that facilitate and promote unfair, unlicensed, and deceptive chatbots posing as mental health professionals.

The complaint, submitted to the Attorneys General and Mental Health Licensing Boards of all 50 states and the District of Columbia, as well as the Federal Trade Commission, illustrates how Character.AI and Meta’s AI Studio have enabled therapy chatbot characters to engage in the unlicensed practice of medicine, including by impersonating licensed therapists, providing fabricated license numbers, and falsely claiming confidentiality protections.

“For far too long, these companies have made a habit of releasing products with inadequate safeguards that blindly maximize engagement without care for the health or well-being of users,” said Ben Winters, CFA Director of AI and Privacy. “Enforcement agencies at all levels must make it clear that companies facilitating and promoting illegal behavior need to be held accountable. These characters have already caused both physical and emotional damage that could have been avoided, and the companies still haven’t acted to address it.”

The filing, joined by 22 groups including the National Union of Healthcare Workers, the American Association of People with Disabilities, and Public Citizen, warns that without swift action, AI companies will continue to jeopardize public safety, particularly for vulnerable users seeking support during a mental health crisis.

The coalition urges immediate investigations, enforcement actions, and regulatory guidance to ensure AI tools cannot masquerade as licensed therapists or mislead the public about professional credentials.