Facebook, which recently vowed to take opioid sales and promotion more seriously in the wake of several Congressional hearings, has now decided to help fight addiction. For Facebook users, this means that searches for opioids on the platform will instead surface resources for help with a substance abuse disorder.

Facebook made the announcement just a week before an opioids summit hosted by the FDA. The platform has been accused by lawmakers of helping to facilitate illegal drug transactions, and the summit will address unlawful drug sales and trafficking online. Facebook told The Hill that it has been working for months with Facing Addiction, a recovery advocacy group, and the federal Substance Abuse and Mental Health Services Administration on ways to combat online drug sales.

How Does the Diversion Work?

The system works so simply that it’s difficult to understand why it took Facebook so long to act.

When users enter an opioid-related query into the search function, they receive a message asking whether they would like information on treatment resources, along with a link to an addiction resource hotline.
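Facebook hasn’t published how the intercept is implemented, but the basic idea can be sketched in a few lines of Python. The keyword list, function name, and helpline text below are illustrative assumptions, not Facebook’s actual code.

```python
# Minimal sketch of a keyword-triggered search intercept, as described above.
# All terms, names, and messages here are hypothetical examples.

OPIOID_TERMS = {"oxycodone", "oxycontin", "hydrocodone", "fentanyl", "percocet"}

HELP_PROMPT = (
    "Can we help? If you or someone you know is struggling with opioid misuse, "
    "free and confidential support is available through an addiction helpline."
)

def intercept_search(query: str):
    """Return a treatment-resources prompt instead of results for opioid-related queries."""
    tokens = set(query.lower().split())
    if tokens & OPIOID_TERMS:
        # Suppress normal results and show the diversion message instead.
        return {"results": [], "redirect_message": HELP_PROMPT}
    return None  # fall through to ordinary search handling

if __name__ == "__main__":
    print(intercept_search("buy oxycodone online"))
```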

“We look at this as one of a number of steps that we’ve taken and will be taking to find ways to connect the community on Facebook with the resources they need,” Avra Siegel, the Facebook policy programs manager overseeing the effort, told reporters.

“Every time we’re made aware of content on our platform that violates these standards, and if Facebook is in any way facilitating activity like drug sales, we remove it,” Siegel said. “We have a number of ways that we’ve tried to prevent the opportunity for that to occur. I think what’s really important is we have a very proactive, iterative process.”

Will It Work? Time Will Tell

This policy is another instance of Facebook choosing to automate rather than manually monitor content. In the past, when coping with controversy, CEO and founder Mark Zuckerberg has pointed to algorithms. Facebook already uses algorithms meant to flag and censor content, but those algorithms have created problems of their own. For example, breast cancer survivors and breastfeeding mothers have been banned from the platform after posting images that automated systems deemed pornographic.

Drug users have relied on euphemisms and “code words” to obtain their drug of choice since Prohibition in the 1920s, when drinkers referred to alcohol as “hooch.” It’s hard to imagine that online drug dealers won’t find a way to circumvent the algorithms; they are always finding new ways to push their products. Time will tell whether Facebook can ban code names as quickly as dealers invent them. Nevertheless, the effort is a start from a platform that, until now, has been very slow to do anything at all about social media drug dealing.
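To see why a fixed keyword list is so easy to dodge, consider a toy filter. The terms below are hypothetical stand-ins, not Facebook’s actual blocklist or real street slang in use on the platform.

```python
# Illustrative only: a static keyword blocklist misses rotating code words.

BLOCKLIST = {"oxycodone", "fentanyl", "percocet"}

def is_flagged(post: str) -> bool:
    """Flag a post only if it contains an exact blocklisted term."""
    words = set(post.lower().split())
    return bool(words & BLOCKLIST)

print(is_flagged("percocet for sale, DM me"))  # True: exact term matches the list
print(is_flagged("blues for sale, DM me"))     # False: slang slips straight through
```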
