New Instagram Team To Address Platform's Mental Health Issues

By Victoria Kim 04/05/18

Instagram is assembling a "wellbeing team" to make the app a safer place for users.

[Image: person holding a phone opened to the Instagram app]

The photo-sharing app Instagram is taking on the trolls by addressing the wellbeing, and inevitably the mental health, of its community of users.

The company confirmed that a new “Wellbeing Team” will focus on just that: the wellbeing of users. “[The team’s] entire focus is focusing on the wellbeing of the community,” said Eva Chen, who manages Instagram’s fashion partnerships, at a recent Cornell Tech event at Bloomberg. “Making the community a safer place, a place where people feel good, is a huge priority for Instagram. I would say one of the top priorities.”

A report by the Royal Society for Public Health (RSPH), a British charity dedicated to improving public health and wellbeing, ranked Instagram No. 1 as “the most detrimental to young people’s mental health and wellbeing.”

But the multibillion-dollar company seems to be moving in the right direction. At the very least, it has fared better than its parent company, Facebook, has over the last month.

Back in 2016, by which time Instagram had become something of a haven for pro-anorexia and pro-self-harm posts, the company added a feature that lets users anonymously flag posts about self-harm or other mental health concerns.

Instagram reviews these anonymous reports and then connects the user to resources that offer help.

The company also banned outright certain hashtags related to self-harm or negative body image. But prohibiting tags like #thinspo only spawned many similar ones.

According to Quartzy, an Instagram spokesperson said the company has “reassessed priorities,” and has released new tools to help promote a positive social media experience.

“These features are mostly content moderation tools, like offensive comment filters that automatically hide inappropriate comments (without deleting them), as well as giving users the ability to create their own comment filters,” Quartzy reported.

By attempting to identify users who may be struggling, Instagram has addressed one key recommendation from the RSPH report.

The RSPH also recommends that social media platforms alert users when they’ve been logged on for too long, and that they clearly mark digitally manipulated images with “a small icon or watermark.” The goal is to counter the negative effects that heavy exposure to airbrushed images can have, especially on young people.


Victoria is interested in anything that has to do with how mind-altering substances impact society. Find Victoria on LinkedIn or Tumblr.