Facebook's New Plan May Curb Revenge Porn, But Won't Kill It

Facebook needed to move against nonconsensual porn. The scandal surrounding Marines United, a secret Facebook group of 30,000 servicemen who shared dozens of women's private images without permission, proved that. Now, the social media giant is finally shuffling in the right direction.

On Wednesday, Facebook released new guidelines for how it plans to curb the sharing of nonconsensual porn, which some call "revenge porn," whether revenge was the motive or not. Under Facebook's new bylaws, if revenge porn pops up in your newsfeed and you report it, a team of (unfortunate) Facebook employees will now vet the image, and implement photo-matching technology to make sure it doesn't spread any further. The protocol works across Facebook Messenger and Instagram, too. But while this policy is a great start—and will give Facebook some much-needed legal cover if federal law ever criminalizes nonconsensual porn—the only way to kill revenge porn is to stop it from being posted in the first place.

Facebook's photo-matching technology should be a huge boon to revenge porn victims. "The constant challenge for the victim is reporting each post that shares their photo," says Mary Anne Franks, who teaches First Amendment and technology law at the University of Miami Law School, and also serves as the tech and legislative policy advisor for the Cyber Civil Rights Initiative. "So we're really excited about this. It will alleviate some of that burden."

Once someone reports an image, Facebook can be reasonably confident photo-matching technology will catch the rest. (Even better, a pop-up will notify the would-be poster that the photo they're about to share is revenge porn.) It's basically the same hashing technology that powers Google's image search, and Facebook already uses something similar to help identify child pornography. Circumventing it is difficult: a person would need to make significant visual changes to the original image, like adding stickers or filters or pasting the person onto a new background, to bamboozle the tech. "If it's just some dude uploading a photo from his phone, this should work really well," says Jen Golbeck, a computer scientist at the University of Maryland.
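The article doesn't say which hashing algorithm Facebook uses, but the idea behind perceptual hashing can be illustrated with a toy "difference hash" (dHash) sketch. Unlike a cryptographic hash, a perceptual hash encodes an image's coarse structure, so a near-duplicate (say, the same photo with a brightness filter) produces the same or a very similar hash. Everything below is a simplified, hypothetical illustration, not Facebook's actual system:

```python
def dhash(pixels, hash_size=8):
    """Compute a difference hash from a 2D grid of grayscale values.

    pixels: hash_size rows of hash_size + 1 values (in a real system,
    the image would first be shrunk and grayscaled to this size).
    Each bit records whether a pixel is brighter than its right neighbor,
    capturing the image's gradient structure rather than exact pixel values.
    """
    bits = 0
    for row in pixels:
        for col in range(hash_size):
            bits = (bits << 1) | (1 if row[col] > row[col + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; small distances mean 'probably the same image'."""
    return bin(a ^ b).count("1")

# Toy 8x9 "image": bright left half, dark right half.
original = [[200] * 5 + [50] * 4 for _ in range(8)]
# A re-upload with a uniform brightness shift (e.g. a light filter applied):
# every comparison between neighbors is unchanged, so the hash is identical.
brightened = [[v + 30 for v in row] for row in original]
# A structurally different image: alternating bright/dark stripes.
different = [[50, 200] * 4 + [50] for _ in range(8)]

print(hamming(dhash(original), dhash(brightened)))  # 0  -> caught despite the filter
print(hamming(dhash(original), dhash(different)))   # 40 -> clearly not a match
```

A matcher would flag any upload whose hash sits within a small Hamming distance of a known image, which is why only substantial visual edits (stickers, new backgrounds) defeat it.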

Legally speaking, it's a good move for Facebook too. If sharing revenge pornography becomes a federal crime—which Franks and congressional representative Jackie Speier (D-California) are working on—Facebook is going to need to find some shelter. As with criminalized content like child porn or terrorist videos, online intermediaries like Facebook would be legally obligated to report revenge porn to the powers that be, retain evidence, and take good-faith measures to stop its spread. The photo-matching technology would help the company do that. "This is one of the ways Facebook could signal they are trying to address this problem in the same way as child pornography," Franks says.

The key word there, though, is "trying." Like child porn or doxxing, revenge porn inflicts damage the first time it's shared, so removing something after it's already been posted is a second-best solution. And this measure wouldn't even catch the nonconsensual porn shared within a closed ecosystem like the Marines United group. "We have to work preemptively. We've got a real problem with people sharing these images in a likeminded group," Franks says. "In that situation, a woman might not find out her photos had been shared for 8 or 9 months." Or ever.

Nor does reporting nonconsensual porn on Facebook stop the image from spreading elsewhere on the internet. When one Marines United member reported that group for its behavior, other members just moved the revenge porn party over to Google Drive. Franks expects other major tech platforms like Google and Twitter to announce similar policies soon, but it will take those systems being interoperable—or at least communicating with each other—to make a dent in this issue.

The only thing that could really stop a group like Marines United is an AI that scans images prior to posting. According to a Facebook spokesperson familiar with the efforts, the company is heading in that direction, and the only thing holding it back is coaching the AI to understand context: the thing that makes a photo revenge porn instead of, say, the "napalm girl" or a work of modern art. "Even if it's right 90 percent of the time, you really don't want to catch stuff that's legitimate," Golbeck says. "It's probably going to take more fine tuning to avoid those user concerns."

The question of context also brings up another user concern: censorship creep. In the past, groups like the ACLU have opposed revenge porn laws not because they condone the behavior, but because the overly broad laws would have also criminalized consensual porn, or even photos of Holocaust victims. And if every major social media platform takes down the same harmless image at the same time, it could be a public-relations disaster.

To avoid those fiascos, the path forward requires both cooperation and codification. "With mushy categories like 'extremism,' you run the risk of censoring political speech or dissent," says Danielle Citron, who teaches law at the University of Maryland. "But if the definition of nonconsensual porn is narrow enough, we could have a shared industry database that avoids the pitfalls." Beyond agreeing to a specific definition of revenge porn, Citron says, companies need to educate their content moderators about the possibility of unintentional censorship, so that missteps don't happen.

Facebook has created a template for others to expand and iterate on. That's a great first step. Now comes the rest of the journey. And finding consensus on limiting internet speech in Silicon Valley? That might be even harder than hunting down revenge porn.

Read more: http://www.wired.com/
