Re-examining the Data in Concussion Research

Systems Science Professor Martin Zwick asks if data mining can reveal new insights into treatments for concussions.


Headlines and talking heads alternately praise organizations such as the NCAA, NFL, and MLS for taking steps to prevent concussions and castigate them for not doing enough to care for those with brain injuries. Bloggers hail the arrival of high-tech impact sensors in helmets as the "Holy Grail" of safety monitoring, while Congress grills league officials on what they knew about football safety.
 
Every year, concussions, also known as mild traumatic brain injuries (MTBIs), result in 2.2 million trips to emergency rooms in the U.S. Athletes of high school age and younger suffer most of these injuries. For most, recovery comes in a matter of days. But MTBIs have been shown to cause permanent mental impairment, even in athletes as young as their early 20s.
 
Despite an investment of over $800 million by the U.S. Department of Defense and more than ten years of research, clinical trials have failed to establish the effectiveness of interventions for brain trauma. Researchers suggest that this lack of definitive results stems from the difficulty of designing and conducting cognitive tests that can measure the severity of MTBIs and the efficacy of potential treatments.
 
Researchers at Portland State, Oregon Health & Science University (OHSU), the U.S. Department of Defense, and Stanford University are leveraging their complementary expertise to address such complex issues.
 
Although trained as a biophysicist, PSU Systems Science Professor Martin Zwick is, by his own admission, no expert on concussions or other brain injuries. His research explores artificial life, machine learning, exploratory modeling, systems theory, and philosophy. Using his systems science expertise, Zwick looks for ways to uncover potentially useful information hidden in large datasets.
 
A few years ago, at a meeting of the Brain Trauma Foundation, Zwick saw a presentation about the problem of inconclusive data in concussion studies and began a conversation with Dr. Nancy Carney, who directs the Brain Trauma Foundation Center for Guidelines Management at OHSU. Carney learned of previous analyses Zwick had done of datasets from diabetes studies and thought his unconventional approach of exploratory modeling might reveal previously unrecognized relationships in MTBI datasets and point researchers toward effective interventions for brain trauma.
 
To analyze the MTBI records and similar datasets, Zwick developed a computational modeling program that uses information theory and graph theory to relate the whole of a dataset to its parts. The software identifies complex relationships among two, three, or more independent variables and ranks how well those relationships predict a dependent variable's behavior.
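
To make that ranking step concrete, here is a minimal, hypothetical sketch that scores subsets of discrete independent variables by the mutual information they share with a dependent variable. It is illustrative only, not Zwick's Occam software, and the variable names and synthetic records are invented for the example.

```python
# Illustrative sketch only: a toy version of the information-theoretic idea
# described above. This is not Zwick's Occam software; the variable names
# and synthetic records are hypothetical.
from collections import Counter
from itertools import combinations
from math import log2

def entropy(items):
    """Shannon entropy (in bits) of the empirical distribution over items."""
    counts = Counter(items)
    n = len(items)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(records, predictors, target):
    """Estimate I(predictors; target) from the sample."""
    xs = [tuple(rec[p] for p in predictors) for rec in records]
    ys = [rec[target] for rec in records]
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def rank_predictor_sets(records, predictors, target, max_size=3):
    """Rank subsets of independent variables by how much information
    they carry about the dependent variable."""
    scored = [
        (subset, mutual_information(records, subset, target))
        for k in range(1, max_size + 1)
        for subset in combinations(predictors, k)
    ]
    return sorted(scored, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    # Hypothetical discretized questionnaire records.
    records = [
        {"diet": "good", "sleep": "low",  "recovery": "slow"},
        {"diet": "good", "sleep": "high", "recovery": "fast"},
        {"diet": "poor", "sleep": "low",  "recovery": "slow"},
        {"diet": "poor", "sleep": "high", "recovery": "slow"},
        {"diet": "good", "sleep": "high", "recovery": "fast"},
        {"diet": "poor", "sleep": "low",  "recovery": "slow"},
    ]
    for subset, score in rank_predictor_sets(records, ["diet", "sleep"],
                                             "recovery", max_size=2):
        print(subset, round(score, 3))
```

A full analysis of the kind Zwick performs would also weigh model complexity and statistical significance; the sketch above shows only the core ranking idea.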
 
Imagine you're conducting a clinical trial on a new therapy, for instance a combination of drugs and electrical stimulation, which earlier studies suggest will improve MTBI patient recovery. You're funded to test whether other factors, such as diet or mental attitude, influence the treatment's efficacy. You therefore have your subjects fill out lengthy questionnaires, generating massive behavioral datasets that will later help you assess your assumptions.
 
Now imagine that the data you've painstakingly collected is inconclusive, neither proving nor disproving your hypothesis. Using Occam, a Discrete Multivariate Modeling software tool he developed at Portland State, Dr. Zwick can potentially identify key factors you were unaware of or were not looking for, given the specific questions your study sought to answer.
 
"[This] is a great tool to use to look at data from a variety of perspectives. The researchers sharing data with me look at particular sets of questions. They're working from their theories and assumptions that suggest one variable ought to predict another, or that two or more data points ought to be associated in a particular way. Their studies are designed to confirm or disprove those assumptions. In my work, I get into the data and explore it, checking to see if there's anything else we might be able to learn."
 
This kind of data mining is often applied in data-intensive fields like genomics, astronomy, particle physics, and finance. But, as Zwick notes, it is increasingly being used to analyze the smaller datasets collected in clinical trials and social science settings. In the case of the MTBI datasets, there is a real possibility that Zwick will uncover new evidence showing what does and doesn't work in clinical trials, or highlight unrecognized relationships between physical, social, or environmental factors and the efficacy of particular interventions.
 
Zwick cautions that his approach is unlikely to lead to any game-changing revelations about MTBI mechanisms; any new insights would need to be tested and confirmed in follow-up studies. Nevertheless, in their first look at the data, he and his OHSU collaborators found that a widely used cognitive test was not a particularly useful measure of brain injury, a finding that will inform how the next study is designed.
 
The collaboration between Zwick and concussion researchers at OHSU and the Brain Trauma Foundation is emblematic of the ways PSU and OHSU can leverage complementary strengths to address complex health issues. This research is unlikely to lead young athletes to give up their dreams of playing contact sports, and advances in protective gear will not eliminate every injury from head-jarring hits. But joint investigations like these into concussion treatments may mitigate the short- and long-term physical and financial costs of America's favorite pastimes.