Disclaimer

This document is for informational purposes only and does not constitute legal advice. Regulations are evolving rapidly. Consult with regulatory counsel for your specific situation and jurisdiction.

US State Regulations

Several US states have enacted or proposed legislation specifically addressing AI in mental health contexts. This represents the most active area of new regulation.

Illinois: Leading the Way

Nevada: AB 406

California: Privacy Focus

State-by-State Summary

State        AI-Specific Law    Relevant Privacy Law    Key Focus
Illinois     Proposed           BIPA                    AI therapy restrictions
Nevada       AB 406             NRS 603A                Licensed oversight
California   Proposed           CCPA/CPRA               Data privacy, consent
Colorado     AI Act (2024)      CPA                     Algorithmic discrimination
New York     Proposed           SHIELD Act              Transparency, audits
Texas        None               TCDPA (limited)         N/A

US Federal Regulations

FDA: Software as a Medical Device (SaMD)

Enforcement Discretion

The FDA currently exercises "enforcement discretion" for many low-risk digital health products, meaning it does not actively regulate "wellness" apps. However, claims of clinical efficacy or treatment can trigger oversight.

FTC: Consumer Protection

HIPAA

The "App Gap"

Many consumer mental health apps fall outside HIPAA because they are neither covered entities nor business associates of covered entities. The result is weaker federal privacy protection for highly sensitive mental health data, a significant regulatory gap.

European Union

EU AI Act

GDPR

Medical Device Regulation (MDR)

If mental health AI qualifies as a medical device, it falls under the Medical Device Regulation (EU 2017/745) with requirements for:

  • CE marking
  • Clinical evaluation
  • Post-market surveillance
  • Quality management system

Other International Frameworks

UK

Post-Brexit, the UK is developing its own AI framework. The UK AI Safety Institute is leading on AI safety standards, and the MHRA regulates medical devices, including software.

Canada

Health Canada regulates software as a medical device, and PIPEDA governs privacy. Guidance on AI in healthcare remains voluntary.

Australia

The TGA regulates medical device software, and the Privacy Act applies to health data. Australia's AI Ethics Principles are voluntary.

WHO Guidance

The WHO's "Ethics and Governance of AI for Health" (2021) provides non-binding guidance emphasizing transparency, safety, and equity.

Compliance Strategy Considerations

Questions to Answer

  1. What claims are you making? "Wellness" vs. "treatment" has major regulatory implications.
  2. Where will users be located? Determines applicable jurisdictions.
  3. Is human oversight involved? Required by many regulations; reduces risk classification.
  4. What data are you collecting? Mental health data triggers heightened requirements.
  5. Who are you working with? Partnerships with covered entities trigger HIPAA.
  6. Does your product make clinical decisions? May trigger FDA and high-risk AI classification.
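The triage questions above can be sketched as a simple screening helper. This is a hypothetical illustration only: the `ProductProfile` fields and the rules in `regulatory_flags` are simplified paraphrases of the triggers discussed in this document, not legal logic, and every name below is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class ProductProfile:
    """Answers to the six compliance triage questions (illustrative only)."""
    makes_treatment_claims: bool        # Q1: "treatment" vs. "wellness" framing
    user_jurisdictions: set             # Q2: e.g. {"US-IL", "EU", "UK"}
    human_oversight: bool               # Q3: licensed professional in the loop
    collects_mental_health_data: bool   # Q4: sensitive data category
    partners_with_covered_entity: bool  # Q5: HIPAA exposure via partnerships
    makes_clinical_decisions: bool      # Q6: diagnosis/treatment decisions

def regulatory_flags(p: ProductProfile) -> list:
    """Map triage answers to the regulatory regimes this document discusses."""
    flags = []
    if p.makes_treatment_claims or p.makes_clinical_decisions:
        flags.append("Possible FDA SaMD oversight (US)")
    if p.partners_with_covered_entity:
        flags.append("HIPAA obligations via covered-entity partnership")
    if p.collects_mental_health_data:
        flags.append("Heightened privacy duties (e.g. GDPR, state privacy laws)")
    if "EU" in p.user_jurisdictions:
        flags.append("EU AI Act risk classification; MDR if a medical device")
    if p.makes_clinical_decisions and not p.human_oversight:
        flags.append("Higher risk classification likely without human oversight")
    return flags

# Example: an EU-facing app making treatment claims with no clinician oversight
# profile = ProductProfile(True, {"EU"}, False, True, False, True)
# regulatory_flags(profile) would surface FDA, privacy, EU AI Act, and oversight flags
```

A checklist like this cannot replace counsel, but encoding the questions forces explicit answers to each trigger rather than leaving them implicit.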

Safe Harbor Approaches

  • Wellness framing: Focus on general wellness support, not treatment claims
  • Human in the loop: Route clinical decisions to licensed professionals
  • Transparency: Clear disclosure of AI nature and limitations
  • Data minimization: Collect only what's necessary
  • Conservative claims: "May help" not "will treat"

Emerging Trends

Watch These Developments
  • More states likely to pass AI mental health laws
  • FDA clarifying digital health oversight
  • EU AI Act obligations phasing in from 2025 through 2027
  • Professional licensing boards updating guidance
  • Potential federal US legislation

Professional Standards and Guidelines

Beyond legal requirements, professional organizations provide guidance on ethical AI use in mental health:

APA (American Psychological Association)

  • Guidelines for the use of technology in psychological practice
  • Emphasis on competence, informed consent, confidentiality

APA (American Psychiatric Association)

  • App Evaluation Model
  • Framework for assessing digital mental health tools

NHS Digital

  • Digital Technology Assessment Criteria (DTAC)
  • Evidence standards framework for digital health

NICE (UK)

  • Evidence standards for digital health technologies
  • Assessment criteria for NHS adoption