Translating Big Data Models to Stem Suicide
September 20, 2022
Suicide is an ongoing problem in the United States. There were an estimated 1.2 million suicide attempts in the US in 2020, according to the Centers for Disease Control and Prevention (CDC), and nearly 46,000 people died by suicide. The numbers are even more dire in the US military. Tens of thousands of active-duty personnel and veterans have died by suicide in the post-9/11 era—four times the number of service members who died during military operations over the same period.
To stem this flood, the US Army, US Veterans Health Administration (VHA), and other healthcare organizations have turned to big data models to help better predict who may be at risk for suicide (See Predicting Suicides: Beyond STARRS). And a recent JAMA Network paper highlights the success of the VHA’s big data model, called the Recovery Engagement and Coordination for Health-Veterans Enhanced Treatment (REACH VET), in enhancing supportive care and risk reduction.
But taking such a research model and translating it into new policies to guide mental health care across a massive healthcare system is a knotty task. It’s one thing to come up with a model in a lab or computer; how will it work in the real world? Despite the host of challenges, it’s worth it for researchers and practitioners to come together to formulate a data-driven clinical plan to better address suicide in the military and veteran community and beyond, said John McCarthy, Ph.D., MPH, director of the VHA’s National Primary Care–Mental Health Integration Evaluation Center.
“This program has allowed us to engage with clinicians around patients at high risk for suicide and give them direction and a tool they didn’t previously have,” he said. “It’s had positive effects on patients in terms of engagement and their receipt of suicide prevention treatment—and that’s work that’s worth doing.”
Doing the Least Harm
More than a decade ago, the US Army and the National Institute of Mental Health partnered on the Army Study to Assess Risk and Resilience in Servicemembers (STARRS) to develop an algorithm that determines which service members are at highest risk of attempting suicide. That pioneering work has generated more than 120 published research papers and demonstrated that big data techniques can identify key variables to stratify a very diverse pool of people by suicide risk. Despite this “steady and prolific” stream of positive results, said Kenneth Cox, M.D., MPH, an epidemiologist who acts as the Army’s science liaison with the STARRS project, there have been ongoing challenges in translating the findings into something that can effectively drive policy.
“Historically, whenever you are doing translational research, there are challenges involved with taking research findings, determining if they are valid, and then deciding if they provide insights into something you can do that is reasonable, legal, and ethical,” he said. “The next step is then deciding how to do it—and ensuring whatever it is you do decide to do won’t cause unintended harm to the very individuals you are trying to help.”
That takes time. Cox said it usually takes about 10 years before a new drug or psychiatric approach, proven in a research setting, becomes routine clinical practice. While the STARRS project has already characterized more than a dozen key variables linked to suicide, project members needed to “move the issue upstream”: to develop practical ways to intervene before a service member faces a psychiatric crisis. To govern that process, the Army created the STARRS Research Advisory Team. Both Cox and his fellow STARRS collaborator, Robert Ursano, M.D., a professor of psychiatry and neuroscience at the Uniformed Services University, are part of this committee.
“As each new published manuscript comes out from the project, we look at the findings and attempt to determine whether they are relevant to a current problem the Army is facing,” said Cox. “When we’re talking about self-harm, it is almost always relevant. But then the next question is, is there something we can do? If there is, do we have the resources for it? Is it evidence-based—and can we expect improvement if we do implement it? We take a very deliberate, careful look at all these questions and then, when it makes sense, we recommend actions that the Army could take.”
Ursano said the Army has already implemented several education and awareness programs to help commanders and clinicians identify potential issues that could lead a service member to suicidal ideation. But one of the biggest challenges they face is that the Army is also the service member’s employer—and one of the largest employers in the United States—which makes recommending service-wide policy changes more difficult.
“It’s very hard to identify a rare event—and suicide remains a rare event—in a large population,” said Ursano. “Because we are abundantly cautious and don’t want to cause harm, we need to understand, as an employer, there is a potential for these findings about increased risk to have significant impact on an individual. If identified, they may not be able to be recruited or continue service. It means we have to set up a very strict approach about what you do when an individual is identified as high risk and what actions are appropriate for command to take. That impacts what kind of policies we can recommend.”
For example, a service member identified as high risk may be stigmatized by fellow soldiers, which can make things worse, or may be passed over for promotion after being relieved of duties that would have helped them rise through the ranks. Balancing the intended benefit against the harm such measures might add to a person’s already precarious mental health is difficult.
Support after Service and Beyond
The VHA’s big data prediction model, REACH VET, rolled out in 2017. It identifies people potentially at risk so their local providers can offer them additional mental health and support services. REACH VET has been shown to reduce the risk of nonfatal suicide attempts, inpatient mental health admissions, and emergency department visits in the veteran community by increasing treatment engagement among those at highest risk. But implementing the program ran into its own obstacles. McCarthy credits the leadership of the VHA, as well as critical local facility “champions,” for helping the greater organization embrace its use.
“Initially, providers were looking for information on program effectiveness. They wanted to know that using it was going to make a difference, and it was difficult for us to say that it would in the early phases of the project because we just didn’t have enough data about the program,” he said. “But our implementation team worked closely with local leadership at our different facilities, who then identified internal champions who would disseminate new results and patient feedback supporting the program. That additional information helped increase utilization.”
It is vital that these champions are honest and let providers know that models like REACH VET are not a miracle cure, said McCarthy. “Our clinicians are engaging with patients and working to address barriers and improve care,” he said. “Most of the people who are identified as high risk by our model, however, have been identified because they’ve been engaged with the VHA for care and have had a lot of mental health encounters. So, trying to assess the incremental benefit for these patients with the additional services provided by the program isn’t easy.”
In fact, the same studies that showed a five percent reduction in documented suicide attempts thanks to REACH VET did not identify any differences in suicide or all-cause mortality. Still, data-driven approaches to identifying high-risk patients can help healthcare providers and their at-risk patients, McCarthy said. And he applauds organizations outside the military and federal government that are developing their own big data models to reduce the number of suicides.
“I think it’s important for health systems to track suicides and see what they can do to help enhance care to prevent them,” he said. “I know that organizations like Henry Ford and Kaiser Permanente are adopting these kinds of approaches. It’s my hope that, over time, we can really facilitate a data-driven assessment of suicide, as well as other adverse outcomes in patient populations, to really make a difference for patients who need our help.”
If you or someone you know is having thoughts of suicide or is experiencing a mental health or substance-use crisis, please call or text 988, or chat at http://988lifeline.org, for a confidential 24/7 connection to support. US veterans and military families also have a dedicated support line (phone, text, and chat): dial 988 and then press 1, or visit https://www.veteranscrisisline.net/.