Sunday, July 14, 2013

Can Big Data solve the mystery of suicide?

Everywhere on the Internet, we're trailed by bots that inspect our searches and social chatter, attempting to predict what we'll buy or watch, or whom we might date next. But in the middle of all that commerce-friendly jibber jabber, some people are saying, in not so many words, "I am going to kill myself." What if a computer program could spot those cries for help as well?

An ambitious new study is analyzing real-time data collected from Facebook, Twitter and LinkedIn accounts volunteered by veterans and active military personnel in order to develop a way to identify those who might be at risk of suicide. The Durkheim Project is a joint effort by predictive analytics firm Patterns and Predictions and the Veterans Education and Research Association of Northern New England, with support from Facebook and funding from DARPA, among others. By mining social media interactions for linguistic cues, the project may someday help healthcare workers identify and reach patients before it's too late.

Men and women of the military are an all too rich source of research material. Every day, 22 U.S. veterans take their own lives, an average of one every 65 minutes, according to a recent government report. In 2012, suicides among active members of the military reached 349, exceeding those killed on duty. Further, mental health experts say, the military culture makes those at risk less likely to ask for help or admit they need it.

Somewhere between the first thought of self-harm and the final act, Durkheim's project director Chris Poulin hopes to find "those subtle cues as they build up, but haven't become much of a problem that you can't change the fate," he told NBC News.

And it really is about subtlety.

"We do know factors that increase risk [of suicide], such as depression, but our best methods rely on individuals telling us," Dr. Craig J. Bryan, associate director of the National Center for Veterans Studies at the University of Utah and a Durkheim Project adviser, told NBC News. "A lot of people who die by suicide don't tell people they're thinking of it, so the difficulty is trying to figure out if there are other indicators that go around this problem of self report."

The first step was to identify suicidal patterns. Three groups were formed using 300 anonymous medical records from the Veterans Administration: patients who died by suicide, patients under psychiatric care, and patients seeking medical treatment but not under any psychiatric care. From that trove of detailed doctors' notes, the project's computer scientists discovered the behavior unique to those patients who ended their lives.
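
As a rough illustration of this kind of approach (not the project's actual code; the model choice, features, example notes and cohort labels below are assumptions), a three-cohort text classifier over de-identified clinical notes might be trained like this:

```python
# Illustrative sketch only: a simple three-cohort text classifier over
# de-identified clinical notes. The Durkheim Project's real models and
# features are not public; this pipeline is an assumed stand-in.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical, already de-identified notes labeled by study cohort:
# "suicide", "psychiatric" or "medical", mirroring the three groups.
notes = [
    "pt reports persistent hopelessness and withdrawal from family",
    "follow-up for ptsd symptoms, sleep disturbance improving",
    "routine visit for knee pain after deployment",
]
cohorts = ["suicide", "psychiatric", "medical"]

# Bag-of-words features weighted by TF-IDF, fed to a linear classifier.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(notes, cohorts)

# Scoring a new note gives probabilities hinting at which cohort's
# language it most resembles.
print(model.predict_proba(["feels like a burden, giving away belongings"]))
```

In practice, any model like this would be trained on far more than three notes and validated carefully before its scores meant anything; the point is only the shape of the pipeline, text in and cohort probabilities out.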

Troubled relationships, debt, job loss, post-traumatic stress, health issues and alienation are common experiences for many United States veterans, and they are also risk factors shared by many people who take their own lives. But risk factors are not the same as warning signs, which can be far less obvious.

According to Poulin, the project's consulting doctors who had treated veterans were surprised not by the common behaviors (agitation, shame, self-hatred) but by the fact that a computer program could detect such often-subtle clues in the medical reports.

It's not a simple matter of key words, either. While the results of this study won't be published for some time, and privacy prohibits discussion of the records themselves, the project's managers point out that the initial patterns are not explainable as words that simply cropped up while cross-referencing text files. "The technology is trying to find the stories within the stories," said Gregory Peterson, a Durkheim Project spokesperson. "We look for certain kinds of language, and we analyze it."

Peterson said they can already identify certain clusters of language that do raise flags, for instance, those that surround notions such as loneliness or agitation. And when multiple clusters appear, there's greater concern. But "this is not some kind of scanning technology," Peterson told NBC News. "It's far more complicated."
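
One hedged way to picture the "multiple clusters" idea is to separate detection from escalation: assume an upstream language model has already tagged which clusters appear in someone's recent posts, and concern grows as distinct clusters accumulate. The cluster names and thresholds below are illustrative assumptions, not the project's actual criteria.

```python
# Illustrative sketch: combining language-cluster flags into a coarse concern
# level. Detection of the clusters themselves (e.g. loneliness, agitation) is
# assumed to happen upstream; this only aggregates the flags.
from typing import Set


def concern_level(detected_clusters: Set[str]) -> str:
    """Map the number of distinct risk-language clusters to a coarse level."""
    count = len(detected_clusters)
    if count == 0:
        return "none"
    if count == 1:
        return "watch"      # a single cluster raises a flag
    return "elevated"       # multiple co-occurring clusters raise greater concern


# Hypothetical output of an upstream classifier over recent posts:
print(concern_level({"loneliness"}))                # watch
print(concern_level({"loneliness", "agitation"}))   # elevated
```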

In the new phase of the project, the foundation established by the initial 300 patient records will be tested against a much larger population of volunteers, with data coming from a variety of apps that will continuously upload the subjects' social media and mobile phone interactions. While there's no minimum number of participants, the project hopes to enlist up to 100,000.

The accumulated social information, safeguarded under HIPAA standards of medical privacy and stored in a secure onsite database at the Geisel School of Medicine at Dartmouth, is analyzed by the computer programs developed during the earlier phase. None of the data is shared with third parties, and the information is arguably better protected than anything stored on Facebook, OKCupid or Edward Snowden's flash drive. As with many medical studies, this stage is observation only: there will be no diagnosis or intervention, and participants are free to drop out of the study at any time.
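
A minimal sketch of what an observation-only loop for this phase could look like, under stated assumptions (the function names, data shapes and stand-in model below are hypothetical; the actual apps, schemas and infrastructure have not been described publicly): each uploaded batch of posts is scored by the previously trained model and stored for later analysis, with no alerting or intervention.

```python
# Illustrative observation-only pipeline: score uploaded posts with a
# previously trained model and persist the scores. No alerts, diagnoses or
# interventions are produced. All names here are hypothetical stand-ins.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PostBatch:
    participant_id: str   # pseudonymous ID, never a raw identity
    texts: List[str]      # posts uploaded by the participant's opt-in app


class DummyModel:
    """Stand-in for the phase-one classifier; returns a flat score per text."""

    def score(self, texts: List[str]) -> List[float]:
        return [0.0 for _ in texts]


def observe(model: DummyModel, batch: PostBatch,
            store: List[Tuple[str, List[float]]]) -> None:
    """Score a batch and append it to the secured research store.

    Observation only: nothing is shared with third parties and no
    intervention is triggered.
    """
    store.append((batch.participant_id, model.score(batch.texts)))


records: List[Tuple[str, List[float]]] = []
observe(DummyModel(), PostBatch("anon-001", ["example post text"]), records)
```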

What goes unsaid in the Durkheim Project press release is the cold hard fact, not uncommon in drug trials for cancer or other potentially fatal diseases: To get optimum results, the researchers are waiting for people to die.

"It's not a happy topic. I don't think we should back away from it just because it's taboo or so stressful," Poulin said, adding that "there's no negative intent there, no creepy objective. This is the cost of doing business in life-or-death research."

The Durkheim Project takes its name from French sociologist Émile Durkheim. In 1897, Durkheim published a groundbreaking case study, "Suicide," in which he used diaries and notes of suicide victims to detect patterns, including social isolation. Similarly, Poulin's team strives to use Big Data to find words, phrases or behavior that can identify the warning signs that, within society, may even go unnoticed by the potential victim.

If those warning signs can be deciphered, it may be "possible to intervene with a narrative that is more positive, and basically change the track of someone's life," Poulin said.

But that's a difficult proposition. Even outside the military population, doctors miss cues. A review of studies by the Mayo Clinic found that approximately 44 percent of people who died by suicide had visited their primary care physician, and 20 percent had visited a mental health care worker, in the month before their deaths.

As with people who have never served, when veterans take their lives, "there is no single reason," Bryan said, explaining that he and other researchers are learning that military service is increasingly stressful in a war that's gone on for 10 years. And once soldiers leave the military, "they return to a society that as a whole is less engaged. Many veterans don't fit in with the rest of the society. Nobody knows what they've gone through, and they begin to feel marginalized. It's when marginalization occurs that suicide risk increases."

Poulin hopes the project will eventually open to volunteers outside the military community. Groups that deal with teens and cyberbullying have already expressed interest, he said.

Dr. Lisa Horowitz, a staff scientist and clinical psychologist at the National Institute of Mental Health, isn't affiliated with the Durkheim Project, but she told NBC News she sees value in the information shared on social media.

"Social media is probably very important, especially with young people who use it as their forum to reach out to their community," she said. Nevertheless, she said, there's a danger in attempting to apply a single protocol for different groups at risk.

"With youths, the way to access suicide is to ask them directly," Horowitz said, citing a study that found adolescents prefer direct questions. "The majority of suicides tend to be committed by older men, and they have a tendency to deny suicide if you ask them directly."

So what works for the military will likely not apply to civilians, but the research, much like social media itself, is still so new that the possibilities themselves are yet to be discovered. "Just because it's complicated and these different groups are so varied doesn't mean there's no solution," Horowitz said. "It just means we have to be careful not to create one prevention style that's meant to work for everybody."

Helen A.S. Popkin is Deputy Tech & Science Editor at NBCNews.com. You can find her on Twitter and Facebook.

Warning signs of suicide
The more of these signs a person shows, the greater the risk. Warning signs are associated with suicide but are not necessarily its causes.

  • Looking for a way to kill oneself.
  • Talking about feeling hopeless or having no purpose.
  • Talking about feeling trapped or in unbearable pain.
  • Talking about being a burden to others.
  • Increasing the use of alcohol or drugs.
  • Acting anxious or agitated, or behaving recklessly.
  • Sleeping too little or too much.
  • Withdrawing or feeling isolated.
  • Showing rage or talking about seeking revenge.
  • Displaying extreme mood swings.

What to do if someone you know exhibits warning signs of suicide

  • Do not leave the person alone.
  • Remove any firearms, alcohol, drugs or sharp objects that could be used in a suicide attempt.
  • Call the U.S. National Suicide Prevention Lifeline at 800-273-TALK (8255).
  • Take the person to an emergency room or seek help from a medical or mental health professional.

Information provided courtesy of the National Strategy for Suicide Prevention.

Source: http://feeds.nbcnews.com/c/35002/f/663301/s/2e9a1472/l/0L0Snbcnews0N0Ctechnology0Ccan0Ebig0Edata0Esolve0Emystery0Esuicide0E6C10A5970A91/story01.htm

