Facebook Users Promoting Harm Reduction Face Bans And Deleted Pages

By Lindsey Weedston 07/15/19

Image: Facebook user about to log on to promote harm reduction

A new Facebook campaign to combat the opioid crisis appears to have unintentionally targeted harm reduction efforts on its own platform: ads for fentanyl-testing kits are resulting in bans, and pages created by harm reduction organizations are being deleted.

A report by Vice features interviews with multiple individuals who say the platform has targeted them in ways that are hampering their efforts to prevent overdose deaths.

Facebook recently teamed up with the Partnership for Drug-Free Kids for the “Stop Opioid Silence” campaign, but its effort to fight drug trafficking on the massive social platform looks to have created more opioid-related silence instead.

This is causing serious problems for organizations, such as Southside Harm Reduction Services, that post warnings on their Facebook pages about local batches of illicit drugs found to contain fentanyl, the extremely potent opioid responsible for many of the overdoses and deaths in recent years. These posts are being rejected or given “reduced distribution,” meaning that even the warnings that do go up are not being seen by the community.

In one particularly severe case, Colin Marcom, the social media manager of a company called BunkPolice, was permanently banned from placing any ads on Facebook after he used the platform to advertise BunkPolice’s fentanyl-testing kits.

These simple kits allow users to test drugs for fentanyl, a synthetic opioid often mixed into heroin, cocaine, ecstasy, and other common illicit drugs.

“Facebook banned my personal account from ever being able to place ads on Facebook again, b/c of an ad, with this picture, that they approved for $20 & it ran for 7 days,” wrote BunkPolice in a Twitter post. “7 days, no warning - right to suspension - I submitted a sensible appeal, they said I was promoting drug use.”

While harm reduction efforts like these have repeatedly been found to save lives without increasing drug use, as some had feared they might, Facebook seems to be treating them like a drug-trafficking scheme. To make matters worse, recent attempts to appeal the bans and the deleted posts and pages have been rejected.

After Vice contacted Facebook for comment, multiple posts from harm reduction pages that had previously been flagged and deleted were restored, suggesting that the problem may be automated. It’s also possible that the vague language in Facebook’s “regulated goods policy,” which allows posts about drug use “in a recovery context,” was misinterpreted by the employees who reviewed the appeals.

An extended report published by The Verge earlier this year found that Facebook moderators are chronically overworked, confused by ever-changing policies, and in some cases have been diagnosed with PTSD from viewing so much extremely disturbing content.

According to the report, these moderators spend less than 30 seconds on an average flagged post before deciding whether to allow or delete it.

Facebook is reportedly “still investigating” the cases in which entire harm reduction pages and groups were deleted.


Lindsey Weedston is a Seattle area writer focused on mental health and addiction, politics, human rights, and various social issues. Her work has appeared in The Establishment, Ravishly, ThinkProgress, Little Things, Yes! Magazine, and others. You can find her daily writings at NotSorryFeminism.com. Twitter: https://twitter.com/LindseyWeedston
