July/August 2009 Issue
Understanding Evidence-Based Practice in Behavioral Health
By David Surface
Social Work Today
Vol. 9 No. 4 P. 22
Clarifying misconceptions about evidence-based practice is the first step to reducing professional resistance to it.
During the past 10 to 15 years, there has been an increased focus in the behavioral health community on delivering what is known as evidence-based practice (EBP). Some of those practices, such as motivational interviewing, psychoeducation, and supported employment, are now common practice in many behavioral health settings. Yet EBP is not as widespread as proponents would like and still faces resistance due to a number of factors, including a basic misunderstanding among behavioral health professionals of what EBP actually is.
According to Haluk Soydan, PhD, director of the Hamovitch Center for Science in the Human Services at the University of Southern California School of Social Work, part of the misunderstanding is due to the fact that the term EBP is often used to refer to two related but different things.
“The term ‘evidence-based practice’ was originally used to describe a process,” Soydan says. “Later on, people started using the same term to refer to any practice that has some kind of acceptable evidence that supports the treatment model. So there’s a confusion in the literature and among social workers who, when they use the term, are often thinking of specific evidence-based practices, not the process.”
Despite the confusion—or perhaps because of it—EBP has become something of a buzzword in the behavioral health community. According to Joan Levy Zlotnik, PhD, ACSW, executive director of the Institute for the Advancement of Social Work Research, a significant number of practice books published in the past two years have the term “evidence-based practice” in their titles.
One of the key issues in the discussion of EBP is how “evidence” is defined. “When you’re talking about what works, you have to be very clear about what that’s based on,” says Zlotnik. “Whatever methods you use to identify what the level of evidence is, you need to be transparent about it.”
In medical research—where the term EBP was first coined—randomized clinical trials are often considered the gold standard of research. “In social work, that’s more difficult,” Zlotnik points out. “It’s very different than when you’re testing medication.”
Zlotnik advises researchers to be explicit not only about the methods they used to gather their evidence but also about the specific outcomes they were looking for: “Was it recovery? Was it the ability to go back to school? Was it related to employment? Was it related to getting along well with different people?”
EBP: How Widespread Is It?
The question of how widespread EBPs in public behavioral health settings actually are is a difficult one, both because there are currently no systematic studies measuring their implementation and because definitions of what counts as an EBP vary widely. Nevertheless, Soydan believes it is safe to say that EBP in behavioral health settings is indeed on the rise.
“I think the infrastructure to develop evidence-based practices has been strongly growing around the globe,” says Soydan. According to him, it started with two knowledge-generating networks of researchers, practitioners, and funders: the Cochrane Collaboration, which produces and disseminates systematic reviews of effectiveness studies in healthcare (including behavioral health) and was launched in 1993, and the Campbell Collaboration, which provides the same kind of systematic review of effectiveness studies in social welfare, education, and criminal justice. “These two have been extremely influential in setting the terms of evidence,” says Soydan.
The other major contributor to EBPs’ growth in public behavioral health settings has been the appearance of so-called clearinghouses, or Web-based databases that retrieve the best possible evidence on what works and reframe the information into plain language that end users such as social workers can more easily understand.
Despite these available informational and technological resources, the shift in knowledge and practice prompted by the appearance of EBPs is still largely dependent—like all such changes—on the human factor.
Bonnie Spring, PhD, a professor of preventive medicine and the director of behavioral medicine at Northwestern University Medical School, explains, “I think there’s naturally a lag because for some people who’ve been in the field for a long time, this is a different way of training than they’ve been exposed to. I think we need to do better in disseminating effective ways to train people in EBP. I think that’s been a gap for a long time, and it takes a while to change.”
Education and Training
It’s one thing to research the efficacy of a treatment and make the results public, but it’s another thing to put that treatment into practice. To that end, the National Institute of Mental Health presented a conference in 2007, “Partnerships to Integrate Evidence-Based Mental Health Practices Into Social Work Education and Research.” One of the conference organizers was Denise Juliano-Bult, MSW, chief of the systems research program at the National Institute of Mental Health.
“One of the main barriers to having these practices available is that there aren’t enough practitioners out there to deliver them,” says Juliano-Bult. “Teaching EBPs in the primary curriculum when people are being trained to begin with will reach many more practitioners than through continuing education alone.”
“The hard challenge is to change what’s happening in the field, in practice placements,” says Spring. “We decided to begin by targeting people who are still in school and then address people who are already in practice. I think it’s necessary to do both things. You have to change the culture, and it’s the younger generation that’s usually more receptive to that.”
While more MSW students are learning about EBP during their studies, many are finding it difficult to make use of that knowledge in their field placements. “In the field placements, you have newbies being mentored by people who have not been trained in this new methodology,” says Spring. “It’s challenging to expect a young person to be able to come in and tip the practice tradition.”
The difficulty that new social workers often encounter in implementing EBP during their field placements is not always due to deliberate resistance. “I believe that most field sites aren’t against this idea,” says Juliano-Bult. “They just don’t have a lot of extra time and resources to add to what they’re already doing.”
Some schools of social work are taking steps to bridge that gap. The University of Michigan School of Social Work, for example, has begun offering minicourses in EBP to field site supervisors and has made the university library and search engines available to the field sites, so they can more easily research which EBPs to implement in their organizations.
Like every change in professional culture, the shift to EBP in behavioral health has met resistance, some of it quite strong. Perhaps this is because the issue cuts right to the heart of a controversy that has been brewing for many years: the actual role of science in the social sciences.
Science works with categories and averages; psychotherapists, by contrast, work with individuals. It’s the individual approach that some in the behavioral health field believe may be threatened by the growth of EBP. Some therapists have expressed concern that a “lockstep adherence” to a set of codified treatments may prevent the therapist from exercising his or her own individual judgment of what is best for a particular client, especially when it deviates from the evidence-based model.
According to Spring, it is this “one-size-fits-all” concept of EBP that its critics dislike the most. “You’ve got your nose in a treatment manual, you’re treating everybody the same,” Spring says. “And you’ve lost all of the good, nonspecific treatment elements like relationship building, the therapeutic alliance, empathy, warmth.”
The proponents of EBP point out that this criticism is based on a lack of understanding of EBP as a process, one in which practitioners apply what is good from the research without being tied to a specific kind of codified treatment.
Spring believes that EBP and traditional therapeutic values are not incompatible. “You don’t want EBPs to be done robotically,” says Spring. “You want them to be done by caring, skilled, empathic therapists who understand the principles of the treatment. They don’t have to follow the manual like a robot would.”
Still, when it comes to ensuring the best outcomes for patients in behavioral health settings, “strict adherence to the manual” may not always be a bad thing. Juliano-Bult points out that when professionals insist on adhering to a manualized treatment, it’s often because that’s the best evidence available.
“The truth is, we probably don’t know enough about the details of these proven models to know how to personalize them for different people,” she says. “When you’re giving someone medication, it’s easier to make an adjustment based on body weight or side effect profile. But when it comes to behavioral health, there’s a lot we don’t know, and that is an area ripe for social work research.”
The most obvious way to determine whether a treatment is helpful is by measuring outcomes. When clinical outcomes don’t match outcomes cited in the research, adjustments are needed.
“Suppose you start giving the treatment that had the very best research support,” says Spring. “Three months into the treatment, the client is deteriorating, but you’re continuing to give the same treatment. I’d say that you’re no longer doing evidence-based practice. Part of the evidence-based practice process is that you have to analyze and adjust. You need to measure how you’re doing because if you’re implementing a best practice but the client is deteriorating, you’ve got to change course. Just doing what the overall body of research says is only where you start. From there on, you make choices based on what your own client’s data show is working.”
While most behavioral health practitioners may understand EBP as a process, insurers may not grasp the concept as easily. So there is concern that, in their quest to identify and reimburse only specific, manualized practices that are cost-effective, insurers may cut the lifeline for practices that do not yet have the “evidence-based seal of approval.”
While Soydan understands this concern, he believes that implementing the most cost-effective EBPs works to every party’s advantage and is ultimately about building confidence among insurers, social workers, and clients. “Of course, the insurance companies want to save money, so they want the practitioners to use treatments that have a proven positive impact on the clients,” says Soydan. “Who doesn’t want that?”
Spring believes that the fact that insurance companies don’t have a broader, more inclusive view of EBP is at least in part the responsibility of the behavioral health community. “We need to come together and tell insurance companies how to measure good outcomes,” says Spring. “As long as they show that the patient is getting better, you should keep paying for the treatment. If we would come together and do that and not just be worrying about how to keep collecting payments for the things we’ve always done, then this situation would be different.”
For EBP to take hold and flourish in the behavioral health community, its proponents recognize the need to take individual differences into account and integrate those considerations into their approach.
Zlotnik points out that a progressive, inclusive approach to EBP should combine the evidence provided in the manual and the practitioner’s knowledge, wisdom, and ethics, as well as the client’s culture, individual interests, and needs. “Those interventions that are drawn from systematic reviews of research studies around a particular intervention are all based on averages, and it doesn’t take into account the ‘outlier,’ or the person for whom that intervention doesn’t work,” says Zlotnik. “So you have to have a system where you can adopt and adapt things to people’s individual needs.”
While Juliano-Bult recognizes that the debate over EBP is likely to continue, she believes that progress will still be made. She points out that integrating EBP into social work education, practice, and research is still an agenda item for the National Institute of Mental Health. “In the meantime, while people might be in disagreement about the particulars, there’s a lot that can be done to help us achieve the common goal of improving outcomes for clients,” she says.
Soydan believes that there is an overriding ethical issue involved in the struggle to adopt EBP in behavioral health settings. “The basis of social work, like any human services profession, is not to harm,” says Soydan. “If this is the value we base our profession on, then it’s critical to know that the interventions we use are not potentially harmful, that there is acceptable, high-quality evidence that this treatment works or at least does not harm the client. From that perspective, evidence-based practice as a professional culture becomes a must.”
— David Surface is a freelance writer and editor based in Brooklyn, NY. He is a frequent contributor to Social Work Today.