It has been argued that critical, scientific thinking is not a natural characteristic of humans.(Achenbach 2015) The prevalence of superstitions in every human culture is a testament to this fact. Modern societies are plagued by the superstitions of religions, astrology, new age medicine, and Deepak Chopra (as such a master of vacuous nonsense, I thought he deserved his own category). Despite the human propensity for magical thinking, science has risen to phenomenal heights, revealing to us amazing facts about the universe and its history. In conjunction with human imagination and ingenuity, science has given rise to mind-blowing technologies, like computers, cell phones, jetliners, and spacecraft. And yet, even in the twenty-first century – the technological age – “smart” people are not immune to the lure of superstition. Francis Collins, the director of the National Institutes of Health and former director of the National Human Genome Research Institute, is an evangelical Christian. William Newsome, neuroscientist and member of the National Academy of Sciences, is a “serious Christian.”(Newsome 2006) People who consider themselves to be wholly rational might ask themselves, “Okay, how are these people, who are undeniably smarter than I am, able to reconcile their science with their superstition?” It is a good question. Science and superstition seem to require thinking patterns that are polar opposites. And yet there is no shortage of scientists who exhibit irrational thinking patterns. Michael Shermer addresses this phenomenon in his book, Why People Believe Weird Things, specifically in the last chapter, entitled “Why Smart People Believe Weird Things.”(Shermer 2002)
I have this former friend – a bright, beautiful young lady with aspirations to get her Ph.D. in neuroscience – who lost her shit one day when I went on a rant about naturopathy. Considering how smart she seemed, I was sort of caught off guard. I later found that she worked for Brain State Technologies, a company that sells biofeedback materials, specifically “brainwave entrainment software.” Turns out that this… product… is apparently marketed to and used by naturopaths for treatment of all sorts of disorders, from insomnia to PTSD – all without any convincing evidence or acceptance by the science-based medicine community.
After a little research into the company via its website and publications, the red flags started popping up. The website claims that the BRAINtellect® technology in particular has helped clients get better sleep, relief from and improved adaptability to stress, significant improvement in memory and recall, and increased performance at work, athletics, learning, and more. The technology itself bears all of the hallmarks of an elaborate placebo, but even more revealing was the published research by Brain State researchers and their collaborators at Wake Forest. As always, when something sounds too good to be true, it probably is.
I uncovered six published research articles and one review. There was also one letter penned by a few of the Brain State and Wake Forest researchers, namely Charles Tegeler, Sung Lee, and Hossam Shaltout, to the Journal of the American Medical Association Psychology (JAMA Psychology). The six research articles were all published in open access journals, all of which have 2014 impact factors between 2.0 and 3.7, and SCImago Journal and Country Ranks (SJRs) ranging from 0.95 to 1.7.

Impact factors and SJRs are indicators of the relative contribution of a journal to its field and to science in general. A journal’s impact factor for a given year is calculated by taking the number of citations received that year by articles the journal published in the previous two years, and dividing it by the number of articles published in those two years. When assessing the importance of a journal to its respective field, you can compare the journal’s impact factor with those of the top journals in the field.

Since the articles published by the Brain State Tech/Wake Forest research group are arguably related to the fields of neuroscience and psychology, I compared the impact factors of the journals in which the researchers published their research with those of the top journals of neuroscience and psychology. The three top research journals (not review journals) in neuroscience were Nature Neuroscience, Neuron, and Biology of Mood and Anxiety Disorders, with 2014 impact factors of 13.8, 12.0, and 7.0, respectively. The 2014 SJRs for these respective journals were 10.5, 10.2, and 7.1. The top three research journals in experimental psychology were the Journal of Experimental Psychology, Cognitive Psychology, and the Journal of Memory and Language; the respective 2014 impact factors were 5.3, 4.9, and 4.4; the respective 2014 SJRs were 3.4, 2.9, and 2.4. For perspective, this means that in the four-year timespan in which these articles were published, the Brain State/Wake Forest researchers could not manage to get even one article into a second-tier journal, let alone a top-tier one.
In addition, all of the journals in which they published are open access, and open-access journals may be less selective than their subscription counterparts.
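For readers who like to see the arithmetic, the two-year impact factor described above can be sketched in a few lines of Python. The numbers here are made up purely for illustration – they are not real figures for any journal:

```python
def impact_factor(citations_this_year, articles_prev_two_years):
    """Two-year impact factor: citations received in a given year by
    articles the journal published in the previous two years, divided
    by the number of articles published in those two years."""
    return citations_this_year / articles_prev_two_years

# Hypothetical journal: 200 articles published in 2012-2013,
# which together drew 700 citations in 2014.
print(impact_factor(700, 200))  # 3.5
```

So a hypothetical journal with a 3.5 would sit comfortably among the mid-tier journals discussed above – respectable, but nowhere near a Nature Neuroscience.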
Okay, so these guys published in so-so journals that aren’t known for having articles that make a splash. But what really tells you how impactful individual papers are is the citation count – how many citations, and by whom. I noticed that none of the articles by the Brain State group is well-cited; the highest number of citations on a single article is only six.(Gerdes, Gerdes et al. 2013) Just for comparison, one of the postdocs at my last job had papers with anywhere from 20 to 42 citations, some of which had been published for less than a year.
There’s a good article in Popular Mechanics that explains how to determine if an article or research group is bullshit. It is entitled, “6 Warning Signs That a Scientific Study is Bogus.”(Fecht 2014) I pulled the following paragraph from it:
“Do the Researchers Cite Their Own Papers? If so, this is a red flag that they are promoting views that fall outside the scientific consensus.”(Fecht 2014)
Let’s go back to Brain State’s/Wake Forest’s publications on brainwave entrainment. I checked all of the citations those articles received and found that all but one came from within the collaborative team. In other words, the only people citing those papers were the authors themselves. Every paper but one was written by some combination of the same 8-10 people. The one outside paper cited a Brain State paper merely to note that it, too, acknowledged the need for research into interventions for PTSD.(Lee, Gerdes et al. 2014, Ginsberg, Pietrabissa et al. 2015) It did not engage with anything in the Brain State paper at all, suggesting that the authors either didn’t read it or didn’t care about its content.
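The check I did by hand above amounts to a simple heuristic: a citation is a self-citation if the citing and cited author lists overlap. A minimal sketch, with hypothetical author lists standing in for the real ones:

```python
def is_self_citation(citing_authors, cited_authors):
    """Crude but common heuristic: flag a citation as a self-citation
    if the citing and cited author lists share at least one name."""
    return bool(set(citing_authors) & set(cited_authors))

# Hypothetical author lists for illustration only:
team = ["Author A", "Author B", "Author C", "Author D"]
outsiders = ["Author X", "Author Y"]

print(is_self_citation(["Author B", "Author D"], team))  # True: the group citing itself
print(is_self_citation(outsiders, team))                 # False: an independent citation
```

Run over a whole citation list, a check like this makes the pattern obvious: if nearly every `True` comes from the same small group, the “impact” is an echo chamber.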
The neurologist Steven Novella has written articles on his blog, Neurologica, about neurofeedback and brainwave entrainment software vendors, and his conclusion is basically that it’s all pseudoscientific quackery – an elaborate marketing ploy to con scientifically illiterate naturopaths into purchasing training and materials to treat placebo-susceptible patients.(Novella 2007, Novella 2008)
This is a lesson about the process of science. First of all, just because there is literature on something doesn’t mean there is true EVIDENCE that it works. Second of all, if something is not widely accepted by the medical community, then more likely than not it just doesn’t work, or it relies on premises that run counter to fundamental scientific principles, or it contradicts robust evidence in the field. In this case, the evidence supports only limited applicability for biofeedback treatments; for most of the claimed uses, evidence of efficacy is simply lacking.
Knowing how science works can protect you, as a potential client or patient who might be throwing away precious time and money on garbage that doesn’t work. It can also protect you as an ambitious young scientist from being drawn into the world of snake oil and pseudoscience, effectively alienating you from the scientific community.
While science is supposed to be an impartial process of gaining knowledge, humans who conduct science are fallible. That’s why we have peer review. Just remember that peer review isn’t over once something is published. Either ideas stand on their merit, or they fall, sometimes grasping desperately, and persistently, for purchase.
What does this mean for the people in your life who fall prey to irrational thinking? It’s hard to tell whether it makes a huge difference how “smart” or educated a person is; when someone is in the grips of irrational ideas, they tend to hold to them, evidence be damned. When it comes to ideas, I believe that most people are unable or unwilling to let them go, clinging to them while vehemently condemning any and all criticism. The sunk cost bias, or sunk cost fallacy, may go some way toward explaining this: it occurs when someone has invested so much in an effort or idea that he or she cannot abandon it, no matter how much evidence points to its inevitable failure. Abandoning the idea means acknowledging a significant lost investment.
My friend has invested in neurofeedback. She has invested years of her life, and a significant amount of intellectual effort. I ask myself what she would think, how would she react if she read my blog, and my conclusion is this: she would cast the evidence aside, and brand me the biased, irrational one. When it comes to resolving cognitive dissonance, only the most intellectually honest succeed.
Achenbach, J. (2015). “Why Do Many Reasonable People Doubt Science?” National Geographic. from http://ngm.nationalgeographic.com/2015/03/science-doubters/achenbach-text.
Fecht, S. (2014). “6 Warning Signs That a Scientific Study is Bogus: The power of peer review, correlation vs. causation, and more ways to tell that a study isn’t what it appears to be.” Popular Mechanics. from http://www.popularmechanics.com/science/health/a10339/6-warning-signs-that-a-scientific-study-is-bogus-16674141/.
Gerdes, L., et al. (2013). “HIRREM™: a noninvasive, allostatic methodology for relaxation and auto-calibration of neural oscillations.” Brain and Behavior 3(2): 193-205.
Ginsberg, J., et al. (2015). “Treating the mind to improve the heart: the summon to cardiac psychology.” Frontiers in Psychology 6.
Lee, S. W., et al. (2014). “A bihemispheric autonomic model for traumatic stress effects on health and behavior.” Frontiers in Psychology 5.
Newsome, W. (2006). “Of two minds: A neuroscientist balances science and faith.” Stanford Medicine Magazine. from http://sm.stanford.edu/archive/stanmed/2006summer/newsome.html.
Novella, S. (2007). “Neurofeedback and the Need for Science-Based Medicine.” Neurologica. from http://theness.com/neurologicablog/index.php/neurofeedback-and-the-need-for-science-based-medicine/.
Novella, S. (2008). “Brainwave Entrainment and Marketing Pseudoscience.” Neurologica. from http://theness.com/neurologicablog/index.php/brainwave-entrainment-and-marketing-pseudoscience/.
Shermer, M. (2002). Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time. Macmillan.