
January/February 2017 Issue

Technology Trends: Developing Standards for Mental Health Apps
By Susan A. Knight
Social Work Today
Vol. 17 No. 1 P. 6

Of the more than 165,000 health apps on the market, a sizable number are targeted at users with mental health issues. With so many apps readily available for anyone to download and use, there are growing concerns about the lack of standards for these apps and their potential impact on users.

"There is no governing body, oversight, or approval process that apps have to go through," says Ken Weingardt, PhD, scientific director at the Center for Behavioral Intervention Technologies and an associate professor in the department of preventive medicine at Northwestern University's Feinberg School of Medicine. The need for some sort of oversight or standards, he says, is very much on everyone's radar.

"The FDA has taken a hands-off approach to apps in general," says John Torous, MD, codirector of the digital psychiatry program at Beth Israel Deaconess Medical Center in Boston. Torous serves as a staff psychiatrist and clinical informatics fellow at the center, and is the editor-in-chief for JMIR Mental Health, the leading academic journal on technology and mental health.

General Wellness vs. Mental Health Apps
Torous explains that because the barriers to entry for app development are so low, anyone can put out a mental health app and make bold claims about it. As a result, the arena has been flooded with thousands of apps of varying quality and reliability. In such an environment, it can be difficult to determine whether a given app is safe, effective, and in line with established best practices. This is less of a concern for apps designed to support and promote general wellness, but a greater one for mental health apps.

General wellness apps typically present little to no risk to users, so "there is far less of a need for rigorous evaluation measures and regulation," Weingardt says. But where apps are used in the treatment of diagnosable mental health conditions, the standards applied should naturally be higher.

The good news is that, with a few exceptions, the vast majority of mental health apps are unlikely to cause direct or serious harm to most users. However, an app that fails to live up to its promises can still be problematic. Even if it causes no outright harm, Torous explains, an ineffective app that doesn't fulfill its stated claims can have negative consequences for a patient.

He outlines a scenario where a patient uses an app but fails to achieve the promised results, such as a reduction in symptoms or an improvement in mood. The patient might incorrectly conclude that this poor outcome is due to some personal failing, or that his or her situation is uniquely difficult and unresponsive to care. In reality, it may simply be that the app itself was ineffective and unable to deliver what was being promised for this patient in this particular situation.

There are no requirements regarding the amount of input, if any, an app developer receives from health care experts during the development process. Once again, this has different ramifications depending on the type of app. For general wellness apps, input and consultation from health care experts is less of an issue; for a mental health app, such input is essential to ensuring the app's effectiveness and reliability. "If the app is making therapeutic claims," Torous says, "clinician involvement becomes more important. You want an expert in the space offering guidance."

Weingardt also believes that input and consultation from clinicians is critical in the development, delivery, and evaluation process. "Clinicians absolutely need to be involved at multiple points throughout the process," he says. "Their perspective is vitally important."

Risks, Benefits, Usability, and Interoperability
The volume and diversity of mental health apps, along with their dynamic nature, make it difficult to develop a single set of standards or a scoring system that can be applied across the board. In order to assist patients and clinicians with app selection, the American Psychiatric Association Smartphone App Evaluation Task Force has taken a different approach. Torous, who chairs the task force, explains that the focus is to "provide people with education and relevant resources, so they can make an informed decision when selecting an app." To facilitate this, an easy-to-understand, four-level framework has been developed, encompassing risks, benefits, usability, and interoperability.

In assessing risk, people are encouraged to look at an app's privacy and safety features. This includes aspects such as password protection, data encryption, and options for data deletion. It's also important to look at how the data collected will be used. "What happens to your personal information once you enter it?" Torous asks. With mental health apps, people are typically providing large amounts of personal and health information. This information is extremely valuable for marketing purposes, and it's quite possible that the information collected is being sold. "People often don't realize how much of their information is being disclosed to third parties," he says.

As a first step, Torous advises that users check the app's privacy policy. "If the app has no privacy policy," he says, "that's a red flag."

Assessing an app's proposed benefits is important so that people don't waste their time with a tool that is likely to be ineffective. Torous offers some key questions to ask: "Is there any evidence that this app is going to be helpful?" and "Does it align with clinical best practices?"

An app's usability is also important. Even after going to the trouble of downloading an app, people won't follow through with using it if it isn't well designed and user-friendly.

Lastly, interoperability is important because it supports the sharing of relevant health information with the patient's other care providers.

While no app is perfect, weighing these factors will equip people to make more informed decisions about which apps to use, based on the privacy and safety an app affords and its overall likelihood of effectiveness.

Alignment With Treatment Goals
Even the most secure, user-friendly, evidence-based app will be of little value if it is ill suited to the person or the situation. For any app to be effective, it needs to meet the patient's needs and align with the patient's treatment goals. With this in mind, social workers and other care practitioners are perfectly positioned to offer guidance and support to patients in selecting and using apps.

If a patient plans to use a particular app, the patient should bring the app to the clinician's attention. The clinician, having been made aware of the app, can then offer input and support for its use. "It needs to be a collaborative approach," Torous says, "with a conversation taking place between the clinician and the patient. Both sides need to come together and talk about the risks and benefits.

"You want to have all your treatment goals aligned," he continues, "and everything to be linked, in order to get the best clinical outcomes."

Weingardt further highlights the benefit of such patient-clinician collaboration: "When you integrate these tools into the delivery of clinical care, the clinician is right there in the process. The clinician can check to see if the app aligns with the treatment being provided."

Weingardt also cites the role of clinicians in introducing apps to their patients, and the need for more collaboration in this regard with health care providers. "You don't implement these apps in a vacuum," he says. "We absolutely need to have clinicians involved in rolling out the apps. We need to ask, 'How can we work with clinicians and practitioners in integrated care to get these apps to patients?'"

Trust, Transparency, and Evidence
With so many mental health apps available, but no guarantees regarding their soundness and efficacy, app selection presents ongoing challenges for both patients and care practitioners. Trust, transparency, and evidence are essential elements that need to be present, Torous says. Going forward, we can expect to see people increasingly taking those elements into consideration as part of the app selection process. "We're going to see people demanding apps that are more effective, and that offer privacy protection," he says, along with a growing demand for apps that are medically tested and evidence based.

Weingardt sees enormous potential for mental health apps as an adjunct to therapy, with practitioners leveraging them to provide even more effective patient care. But more research is needed to support this. "I would like to see more robust research to demonstrate that these apps are effective," he says, "so we can say with certainty that they really do work."

Notwithstanding the still evolving landscape with regard to standards, research, and validation, Weingardt encourages clinicians to embrace the use of apps and other related tools. "Technology is not competition for health care practitioners and mental health service providers," he says. "It's a way of improving and enhancing the services they provide. Don't be afraid of these new technologies. They not only have the potential to make patient care more efficient, they provide a real opportunity to improve health outcomes."

— Susan A. Knight works with organizations in the social services sector to help them get the most out of their client management software.