Telehealth via TikTok Is Not Protected Under HIPAA, but Zoom Is: What Attorneys Need to Know About Mental Health Apps
12.11.2020
The future of health and wellness is in your smartphone.
With apps that make it easy to count calories, track daily steps, or monitor your heart rhythm, consumers are more empowered than ever to take control of their health.
But what about apps geared for mental health, such as those related to relaxation, stress management or sleep? In addition to helping with basic wellness goals, these apps may also offer self-help tools, therapeutic activities and access to treatment delivered by licensed mental health professionals.
Ada Chan (Gardner Weiss & Rosenblum) discussed the risks and challenges presented by mental health apps in the CLE webinar, “Overview of Mental Health Apps And Legal Privacy Concerns.”
In the simplest terms, mental health apps aim to improve mental health and well-being, with functions ranging from guiding recovery from mental illness to encouraging habits that improve emotional health.
Chan noted that some companies have started to offer mental health app subscriptions as part of wellness packages. She said offering them makes sense given the anxiety and depression the pandemic has fueled. “This could be a very good trend to have even in the future,” she said, adding that the improved accessibility is especially important in rural areas.
Wellness and medical providers have used telemedicine and doctor-on-demand services more during the pandemic, and telepsychiatry and teletherapy have increased as well. Chan noted that an app might carry a fine-print disclaimer that it is intended as a supplement to, not a replacement for, therapy.
Chan said there are hundreds of thousands of individual apps devoted to health and wellness, which is “impossible to comb through.” How effective a given app is remains in doubt until it is studied, and there is also concern that some apps are designed to aid research rather than to meet the needs of the user.
Empathic-response chatbots, which can understand a user’s emotional state and respond on an appropriate emotional level, are also becoming more common. Emotional chatting machines that can respond even better to questions and statements are in the works, and speech recognition software can now detect post-traumatic stress disorder and depression with 89% accuracy. A chatbot might suggest that the user seek immediate help, but go no further; an artificial intelligence algorithm might detect a crisis and call law enforcement. Facebook’s artificial intelligence scans flagged posts and has human moderators determine whether to call law enforcement.
With people opening up about their mental health, how is privacy protected? Chan said that the Health Insurance Portability and Accountability Act (HIPAA) penalizes covered entities and their business associates for the improper release of protected health information and establishes national requirements for notification of data breaches. Covered entities include most health care providers, many health plans and health care clearinghouses.
The Health Information Technology for Economic and Clinical Health Act (HITECH Act) expanded HIPAA’s coverage to business associates, those that provide a service for or on behalf of a covered entity, such as outside transcription services and billing services.
The Federal Trade Commission’s Health Breach Notification Rule and Section 5(a) of the Federal Trade Commission Act also come into play; Section 5(a) prohibits “unfair or deceptive acts or practices in or affecting commerce.”
“It covers everything that HIPAA does not cover,” said Chan. “It is more expensive to notify people of the FTC health breach than it is to pay the fine.”
The FTC treats each violation of the rule as an unfair or deceptive practice, and businesses that violate it may be subject to a civil penalty of up to $43,280 per violation.
Wellness programs offered by employers are covered under HIPAA if they are offered as part of a group health plan, and if the employer stores information on behalf of the group plan, the employer is also subject to HIPAA rules. Penalties range from a civil penalty of $100 for a failure to comply with a privacy rule requirement to $250,000 and ten years’ imprisonment for a person who intends to sell, transfer or use individually identifiable health information “for commercial advantage, personal gain or malicious harm.”
The Food and Drug Administration may regulate certain apps as software medical devices. For example, EndeavorRx, an FDA-approved video game used to treat children ages 8 to 12 with attention deficit hyperactivity disorder (ADHD), is HIPAA-protected and requires a prescription. Even when an app poses a low risk to the patient, the FDA retains discretion over whether to enforce its requirements.
Chan said that teletherapy and telepsychiatry apps are protected by HIPAA. A mental health app covered by HIPAA must follow strict rules to protect privacy, chief among them automatic logoff and end-to-end encryption.
However, due to COVID-19, covered health care providers may use popular video-chat applications such as Zoom and FaceTime to provide telehealth, as long as the application is not public facing, like Facebook Live or TikTok.
Chan examined two common mental health apps on the market: MindDoc and Woebot. MindDoc screens for and helps alleviate depression, and screening results can be presented to a psychologist for diagnosis. User information is encrypted and saved on the app’s server, but user data, though not user identities, will be sent to universities and partners for research. Woebot started as a Facebook Messenger service. Conversations cannot be deleted, and the service may access information from third parties. “Facebook owns every conversation you had with Woebot,” said Chan.
The U.S. Department of Veterans Affairs maintains its own apps designed specifically for veterans. Chan said the VA’s PTSD Coach app has been “severely criticized” over privacy concerns; it asks for access to the phone’s contacts, pictures and user files.
New York does not allow private claims for data breaches, but it permits enforcement by the state’s attorney general. The “Stop Hacks and Improve Electronic Data Security” (SHIELD) Act requires businesses to implement reasonable safeguards for the “private information” of New York residents and broadens New York’s security breach notification requirements. The SHIELD Act applies to lawyers and law firms of all sizes, and its security requirements took effect on March 21, 2020. If a business violates the act knowingly or recklessly, a court may impose a civil penalty.