Complaint and Request for Investigation: Unlicensed Practice of Medicine and Mental Health Provider Impersonation on Character-Based Generative AI Platforms

A broad coalition of consumer protection, digital rights, labor, disability, and democracy advocacy organizations, led by CFA, today filed a formal request for investigation urging state and federal regulators to investigate AI companies that facilitate and promote unfair, unlicensed, and deceptive chatbots posing as mental health professionals, and to enforce their laws against them.
The complaint, submitted to the Attorneys General and Mental Health Licensing Boards of all 50 states and the District of Columbia, as well as to the Federal Trade Commission, details how Character.AI and Meta’s AI Studio have enabled therapy chatbot characters to engage in the unlicensed practice of medicine, including by impersonating licensed therapists, providing fabricated license numbers, and falsely claiming confidentiality protections.
Complaint and Request for Investigation Submitted by:
The Consumer Federation of America; Reset Tech; Tech Justice Law Project; the Electronic Privacy Information Center; AFT; AI Now; American Association of People with Disabilities; Autistic Women & Nonbinary Network; Bucks County Consumer Protection; Center for Digital Democracy; Center for Economic Justice; Common Sense; Consumer Action; Incarcerated Nation Network; Issue One; The National Union of Healthcare Workers; Oregon Consumer Justice; Public Citizen; Sciencecorps; Tech Oversight Project; the Virginia Citizens Consumer Council; and the Young People’s Alliance