
Psychology’s CAM Controversy


Poor psychology. Times have been tough lately for both psychological science and the profession of clinical psychology.

First, there was the reproducibility controversy: a recent effort to independently reproduce the results of 100 recently published psychology experiments found that only 39 percent of the findings could be replicated. This news, combined with reports of outright fraud, wounded the image of behavioral research and spurred psychologists to raise their standards for research methodology and peer review.

Then came the torture controversy. This past summer, an independent investigation found that the American Psychological Association (APA) had secretly collaborated with the George W. Bush administration in support of interrogation programs involving torture. Once this information was made public, many APA members were outraged at actions that stood in such obvious conflict with the association’s ethical code forbidding harm.1 Several resignations from APA’s administration followed, and the repercussions of the incident are still being felt.


Meanwhile, other controversies are simmering. A recent debate that has not received much attention involves professional psychology’s relationship to complementary and alternative medicine (CAM). For over half a century, professional psychology has been committed to a scientist-practitioner model. In the best programs, graduate training in clinical psychology combines professional preparation in applying psychology to human problems with training in behavioral science. Typically, the PhD in clinical psychology is granted only to candidates who have learned scientific methodology and conducted an original research project as part of their doctoral dissertations. The idea is that, even if psychologists go into private practice and never do research again, they should be able to read and evaluate the scientific literature for the purpose of providing the most effective treatments available.2

Despite this professed grounding in science, professional psychology has struggled to be genuinely scientific. One clear example of psychology’s rocky footing is the use of CAM in clinical practice. Earlier this year, Australian psychologist Peta Stapleton and colleagues published a survey of 193 practicing psychologists in the United States, United Kingdom, Australia, and New Zealand. They found that almost all of the psychologists surveyed (99.6 percent) had used at least one CAM technique in the past. In addition, 64.2 percent of psychologists had received “some level of more formalized training in at least one complementary or alternative therapy” (Stapleton et al., 2015, p. 193). Table 1 lists the CAM therapies surveyed by Stapleton and colleagues.3


Table 1. Complementary and Alternative Medicine Therapies in Stapleton et al. (2015)

  • Acupuncture
  • Ayurveda
  • Biofeedback
  • Chelation therapy
  • Chiropractic care
  • Deep breathing exercises
  • Diet-based therapies
  • Energy psychology (e.g., emotional freedom techniques)
  • Vegetarian diet
  • Macrobiotic diet
  • Energy healing therapy
  • Folk medicine
  • Guided imagery
  • Homeopathic treatment
  • Hypnosis
  • Massage
  • Meditation
  • Megavitamin therapy
  • Natural products (nonvitamin and nonmineral; e.g., herbs and other products from plants, enzymes)
  • Naturopathy
  • Neurolinguistic programming
  • Timeline therapy
  • Prayer for health reasons
  • Prayed for own health
  • Others ever prayed for your health
  • Participate in prayer group
  • Healing ritual for self
  • Progressive relaxation
  • Qi gong
  • Reiki
  • Tai chi
  • Yoga
  • Other (please specify)

The sample in the Stapleton study was relatively small, but the investigators found that psychologists in New Zealand were significantly less likely to use CAM than those in the other three countries and that female psychologists were significantly more likely to use it than male psychologists.


The use of CAM is not entirely surprising given that, in recent years, some respected psychologists have published articles and books encouraging the integration of CAM into clinical practice. In 2012, Loyola University Maryland psychologists Jeffrey E. Barnett and Allison J. Shale published an article titled “The Integration of Complementary and Alternative Medicine (CAM) Into the Practice of Psychology: A Vision for the Future” in the journal Professional Psychology: Research and Practice, which is published by the APA.4 In that article, Barnett and Shale review what research there is in support of CAM and conclude:

Accordingly, to fulfill this ethical obligation, each psychologist will want to include discussion of reasonably available treatment options. For many presenting problems, this discussion should include various CAM modalities whose use for particular difficulties is supported by the relevant scientific literature. (p. 582)

Furthermore, in 2014, Barnett and Shale, with additional coauthors, released a book titled Complementary and Alternative Medicine for Psychologists: An Essential Resource, published by the American Psychological Association. These and other efforts within mainstream psychology are clearly lending legitimacy to the use of CAM. But is that legitimacy justified?

Thankfully, this is not the end of the story. Last month, Lawton K. Swan of the University of Florida and several coauthors published an article titled “Why Psychologists Should Reject Complementary and Alternative Medicine: A Science-Based Perspective,” also in Professional Psychology: Research and Practice.5 This article was a direct reply to Barnett and Shale (2012) and, as the title makes clear, came to a very different conclusion.

Swan and colleagues make a strong case for a science-based approach to practice, not merely an evidence-based one. In a science-based approach, randomized controlled trials are highly valued, whereas case studies and investigations that lack well-constructed control groups are given less weight. Swan and coauthors constructed the following hierarchy for the quality of research evidence, which they based on a 2006 model proposed by the APA Presidential Task Force on Evidence-Based Practice.6 At the top of the hierarchy are randomized controlled trials (RCTs) and systematic summaries of groups of RCTs, most often done in the form of meta-analyses.

Figure 1. A hierarchy of research evidence based on the 2006 APA Presidential Task Force on Evidence-Based Practice (from Swan et al. 2015).
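
To make the ranking concrete, here is a minimal illustrative sketch in Python; it is not drawn from Swan et al. or the task force report, and the level names are paraphrased assumptions rather than the task force’s exact categories. It simply encodes the idea that study designs higher in the hierarchy count as stronger evidence than those lower down.

# Illustrative sketch only: one way to encode an evidence hierarchy like Figure 1.
# Level names are paraphrased assumptions, not the APA task force's exact wording.
EVIDENCE_HIERARCHY = [
    "meta-analysis of randomized controlled trials",      # strongest evidence
    "individual randomized controlled trial (RCT)",
    "nonrandomized (quasi-experimental) controlled study",
    "single-case or observational study",
    "uncontrolled case report or clinical anecdote",      # weakest evidence
]

def evidence_rank(design: str) -> int:
    """Return a design's position in the hierarchy; 0 is the strongest level."""
    return EVIDENCE_HIERARCHY.index(design)

# Example: an RCT outranks an uncontrolled case report.
assert evidence_rank("individual randomized controlled trial (RCT)") < evidence_rank(
    "uncontrolled case report or clinical anecdote"
)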

In a rather Herculean effort, Swan and colleagues went on to perform a systematic re-review of the evidence cited by Barnett and Shale, with particular attention to the quality of the control groups employed in each study. First, they evaluated the individual RCTs cited by Barnett and Shale and found that only three out of ten studies had adequate control groups. Of these three, only one showed positive effects for the CAM in question, Reiki. Next, they evaluated all the individual studies summarized in the meta-analyses cited by Barnett and Shale. Of the 230 studies included in these meta-analyses, only thirty had adequate control groups. Of the remaining studies, most (57 percent) failed to show a significant effect of the technique under study. Some of the more common problems within the 200 excluded studies were that investigators had failed to blind participants to the independent variable of the research (setting up the possibility of placebo effects) and that the studies had never been described as RCTs in the first place.

At the conclusion of their re-review of Barnett and Shale, Swan and colleagues made the following summary statement:

From a science-based perspective, the highest quality empirical studies (RCTs) cited by Barnett and Shale as evidence for CAM’s efficacy in fact support the opposite conclusion—that CAMs either have not been properly evaluated or simply do not work better than credible placebos. (p. 329)

It is encouraging that a rigorous review of CAM therapies with such a strong conclusion should be published in an APA journal, but if the Stapleton survey of practicing psychologists is any indication, many practitioners are using unsupported methods with their clients. Part of the problem may be that the evidence-based movement is still quite young and that many clinicians were trained long ago. “I suspect we will only really know the impact of the evidence-based practice movement when the generation trained within that framework become the majority of practitioners,” said Sondre Skarsten, one of Swan’s coauthors and a graduate student at the University of Chicago Center for Decision Research.

Recently it has been proposed that a distinct accreditation system be established for psychology training programs committed to a more rigorous, science-based approach to practice.7 This would go a long way toward making it clear which practitioners were using the most scientifically grounded techniques and which were not. Unfortunately, Swan points out that little progress has been made in that direction so far, and “in the meantime, the public endures far, far too much bunkum from a profession that fancies itself a member of the evidence-based elite.”8

References

  1. See http://www.apa.org/ethics/code/index.aspx, Principle A under General Principles.
  2. Baker, D.B., and L.T. Benjamin Jr. 2000. The affirmation of the scientist-practitioner: A look back at Boulder. American Psychologist 55(2): 241–247.
  3. Stapleton, P., H. Chatwin, E. Boucher, et al. 2015. Use of complementary therapies by registered psychologists: An international study. Professional Psychology: Research and Practice 46(3): 190–196.
  4. Barnett, J.E., and A.J. Shale. 2012. The integration of complementary and alternative medicine (CAM) into the practice of psychology: A vision for the future. Professional Psychology: Research and Practice 43(6): 576–585.
  5. Swan, L.K., S. Skarsten, M. Heesacker, et al. 2015. Why psychologists should reject complementary and alternative medicine: A science-based perspective. Professional Psychology: Research and Practice 46(5): 325–339.
  6. Anderson, N.B. 2006. Evidence-based practice in psychology. American Psychologist 61(4): 271–285.
  7. Baker, T.B., R.M. McFall, and V. Shoham. 2008. Current status and future prospects of clinical psychology: Toward a scientifically principled approach to mental and behavioral health care. Psychological Science in the Public Interest 9(2): 67–103.
  8. Citing limitations of space and time, Jeffrey Barnett declined to comment for this article.
