How Two Marines Helped Bring Down Revenge Porn on Facebook

In September 2016, John Albert, a bearded, 30-year-old retired Marine from Greenville, South Carolina, noticed something shocking on Facebook. He had been invited to join a Facebook group called Marines United, and began to scroll through the feed. He saw images of naked female service members and civilians, alongside graphic discussions of sexually assaulting these women.
“Where is she at?” he recalls the comments saying. “‘I’d rape her, I’d bend her over, I’d make her choke.’ There was a picture of a girl standing outside of a building, and they were saying this is what I would do to her, which unit, and which base is she at?”
Disturbed, Albert read through two days of the group’s timeline; more than half of the posts featured nude images or sexually harassing content.
“I got really frustrated and offended,” Albert says. “In the military, we’re supposed to be the people who put our lives on the line to protect our country. But when you become a predator – you’re not protecting your country anymore, you are someone this country needs protection from.”
Albert took screen grabs of what he saw, and reported the group to Facebook. About a week later, he was notified that the group had been shut down. “I thought Marines United was over,” he says.
Then, early last March, investigative journalist and former Marine Thomas Brennan reported that Marines United was again trading photos of nude women in a private Facebook group. Some of the photos were taken secretly, others were hacked, and a few more came from former lovers. Members again suggested sexually assaulting their colleagues, and also shared a Google document that listed women’s names and addresses.
These pages sent ripples through the military community. In an op-ed, a former Marine sergeant claimed the Marines United incident reflected a larger anti-female culture in the armed forces. Then two women, Erika Butner and Marisa Woytek, whose photos appeared on the Marines United page, spoke out at a press conference with their attorney Gloria Allred. “As a rape survivor, I can tell you that this exact behavior leads to the normalization of sexual harassment and even sexual violence,” Butner said. Meanwhile, Commandant of the Marine Corps General Robert Neller was called before Congress and excoriated by Senator Kirsten Gillibrand: “If we can’t crack Facebook,” she said, “how are we supposed to be able to confront Russian aggression and cyber hacking throughout our military?”
This comes at a time when Facebook users are debating the role the company should play in policing content, from fake news to violent videos on Facebook Live. Non-consensual adult pornography, often called revenge porn, is notoriously hard to police. Although 35 states now have laws outlawing non-consensual pornography, it is still not federally illegal. Internet platforms cannot be prosecuted for what their users post, and are only legally required to remove content that violates federal law – so they bear no legal responsibility for removing revenge porn. Most, including Facebook, nonetheless have policies in place to address it. Still, victims and advocates say that despite these policies it is difficult to get non-consensual porn removed from Facebook, and tech policy experts are calling on the company to take more proactive steps to combat the issue.
“I am just sticking up for people that have been victimized,” Albert says. “I would be doing the same thing if this was happening to ginger kids with glasses. Right now, it is women, a year from now it could be gays in the military, three years from now it could be Asians in the military. There is always going to be some group of people that is being oppressed or picked on.”
Sitting at his desk in South Carolina, Albert watched incredulously as Marines United communities full of non-consensual pornography continued on Facebook despite the national attention on the issue. Once the original Marines United community was shut down, Marines United 2.0 and 3.0 appeared. And he watched this happen again and again. A page would be created, he would notice it, and report it repeatedly to Facebook. After several weeks and many reports, the page would finally be removed. But then the same site would pop up again, under another name or the same name, with similar administrators and group members.
“Each page had the same type of activity,” he says. “It showed pictures of female service women, and it is people talking about sexually assaulting them, and talking about trying to find the women’s contact information so they can go find them.”
Frustrated he couldn’t get Facebook to remove these pages, Albert pulled out his laptop, brewed some coffee and got to work. During two tours in Afghanistan, Albert served in intelligence positions and gathered strategic information on enemy combatants. He would track suspicious persons, debrief sources from the field and analyze the intelligence.
“My goal is to find these groups and to shut them down, and to keep on chasing these guys, forcing them to create groups that are more secretive,” Albert says. “Once we get them far enough in the shadows their impact on all the people they victimized will be minimized.”
He launched a mission with the working title Camelot Global, recruiting anonymous sources from across Facebook – each code-named after a Knight of the Round Table – who were equally furious at what they had seen. Together, Albert and his team would infiltrate the groups and monitor their activities to see if they were posting non-consensual porn.
Albert doesn’t sleep well, thanks to the five screws in his shoulder and the two busted vertebrae in his back. But every morning he would review the overnight activity of his targets, document the new content and, once he’d gathered enough intel, notify Facebook.
“If someone is aware this is happening on their website they have a moral responsibility to do something about it,” Albert says. But he still didn’t know how to get Facebook to remove the pages.
Privacy issues – particularly when it comes to non-consensual pornography – have been a problem for Facebook for a long time. In 2009, Holly Jacobs discovered that her Facebook page had been hacked and her profile photo had been replaced with a nude image of her. She went on to find her photos popping up on sites alongside her name, contact information and home address. “I was so afraid that someone would physically stalk me,” Jacobs told the Today Show in 2013. She eventually decided to change her name.
But despite the increased public awareness, non-consensual porn persisted on Facebook. In 2013, California Congresswoman Jackie Speier learned non-consensual pornography was being shared among Marines on pages like “Just the Tip, of the Spear” and “POG Boot Fucks.” The Congresswoman contacted military officials and expressed concern about the message these pages sent to active-duty female service members. “The ‘humor’ expressed on this page … contributes to a culture that permits and seems to encourage sexual assault and abuse,” she wrote at the time. In March, Speier said in a statement, “Nothing has changed.”
Similar incidents continued to occur, beyond the military and around the world. In March 2015, a Penn State fraternity, Kappa Delta Rho, was suspended for posting nude photos of sleeping and unconscious women in sexual positions. In the United Kingdom, a British student pled guilty to posting explicit photos on Facebook to blackmail young women into sleeping with him. In Australia, the Melbourne Men’s Society, a Facebook group, shared nude photos of young women among a community of 7,000 members. In Ireland, a court ruled that a 14-year-old can sue Facebook after naked photographs appeared repeatedly on a “shame” page on the site.
Following the Marines United incidents, on April 5th, 2017, Facebook announced a policy change. The new policy, posted by Antigone Davis, Facebook’s Head of Global Safety Policy, largely mirrors the old one: if users see an “intimate image” that appears to have been “shared without permission,” they can click the button next to the post, and it will be reviewed. If it violates community standards, the team will remove it and may disable the account that posted the image. What is new is that the company will now use technology to prevent flagged images from reappearing on the site.
“It’s wrong, it’s hurtful, and if you report it to us, we will now use A.I. and image recognition to prevent it from being shared across all of our platforms,” Zuckerberg said in a recent Facebook post.
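Zuckerberg didn’t say how the image-matching works under the hood, but systems of this kind typically rely on perceptual hashing: a reported image is reduced to a compact fingerprint, and new uploads are compared against a blocklist of fingerprints so that near-duplicates – recompressed, resized or slightly brightened copies – can still be caught. The sketch below is a toy illustration of that idea, not Facebook’s actual system; the average-hash scheme, the 8x8 pixel grid and the matching threshold are all assumptions for demonstration.

```python
# Toy sketch of perceptual-hash blocking. Real systems (e.g. Microsoft's
# PhotoDNA, which platforms use for child-abuse imagery) are far more
# robust; this only illustrates the fingerprint-and-compare idea.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit
    fingerprint: one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_blocked(candidate_hash, blocklist, threshold=5):
    """True if the candidate is within `threshold` bits of any flagged
    fingerprint -- i.e. it looks like a near-duplicate of a reported image."""
    return any(hamming(candidate_hash, h) <= threshold for h in blocklist)
```

Because the hash depends on each pixel’s relation to the image’s own mean, uniformly brightening or darkening a copy barely changes the fingerprint, which is why a re-upload can be caught even when the file bytes differ.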
A week after Zuckerberg made this announcement, Albert was still spending his days and nights trying to get Marines United pages removed from Facebook, meticulously documenting each one with screenshots showing how these groups featured nude women and appeared to be trading non-consensual pornography.
But each time Albert reported these pages, he received a message from Facebook that these images and groups did not violate the company’s community standards.
“I don’t know how their process works, but whoever is handling this is not doing a good job,” he said at the time. (Sources at Facebook later acknowledged that mistakes were made, but would not answer specific questions about Marine non-consensual porn on their website.)
At a loss, Albert reached out to Erin Kirk-Cuomo, another former Marine who runs the Facebook group Not in My Marine Corps. The group provides a support network for women affected by Marines United, and other forms of gender-based violence in the military. Kirk-Cuomo says she was equally frustrated by Facebook’s response to non-consensual porn on the site.
“It is a tough subject because we all believe that First Amendment rights are something that we hold dear in this country,” says Kirk-Cuomo. “But when it goes into the grounds of encouraging rape, and cyberbullying, it is something perhaps Facebook needs to take a harder look at.”
As Kirk-Cuomo explains, Facebook’s policy means that the only way for women to get this content removed is to know about it, or rely on a good Samaritan to flag it.
“One thing we were hearing from so many people who message us is: ‘How can I figure out if my photo was on Marines United? How do I know?’ I have to tell them, ‘I don’t know,'” Kirk-Cuomo says. “You have to have a friend that was in Marines United, saw your picture and sent it to you. And that’s what’s so difficult, some of these people that had their photos on the site may not even know it.”
And even if these good Samaritans reported the non-consensual porn they found on Facebook, they couldn’t get the content removed. “We had been reporting Marines United 214 for upwards of a month, almost daily,” Kirk-Cuomo says. “It was time to put our foot down and say enough is enough.”
Armed with a copy of Lean In, Kirk-Cuomo wrote an open letter to Facebook COO Sheryl Sandberg. Kirk-Cuomo wanted Sandberg to know the non-consensual porn policy wasn’t working, the pages weren’t being removed, and they were still causing harm. “You said in your book, ‘We cannot change what we are not aware of, and once we are aware, we cannot help but change.’ I am making you aware.”
Less than 48 hours later, Kirk-Cuomo got an email from Sandberg. “She expressed very deep concern over this non-consensual photo sharing issue,” Kirk-Cuomo says. “She assured me that Facebook as a company, as well as her, are deeply troubled by this.”
Sandberg put Kirk-Cuomo in touch with individuals at Facebook who were working directly on this issue. The weekend after Kirk-Cuomo’s call, the five Marines United pages Albert had initially reported were taken down, and several users were removed from Facebook.
Kirk-Cuomo even got direct contact information for Antigone Davis, Facebook’s Head of Global Safety Policy, and was encouraged to reach out if need be. “These images are harmful, and for the sake of victims everywhere, we want to keep them off the Internet,” Davis said in a statement to Rolling Stone. “On Facebook, we’re partnering with leading groups like the Cyber Civil Rights Initiative and preventing reported images from being shared on Facebook, Messenger and Instagram. We’ve helped orchestrate efforts to create a central resource to make it easier for victims to report and have content removed from other services.”
But Mary Anne Franks, the legislative and tech policy director at the Cyber Civil Rights Initiative, says Facebook still needs to be more proactive. She believes the key to preventing non-consensual pornography is keeping it from being uploaded in the first place – the moment when the most damage is done. As soon as nude images are posted online, the shaming begins, and the photos can spread easily across various platforms.
“[The new policy] is a huge step in the right direction. But I think it is only one part of the solution,” Franks says. “If you really wanted to make sure that content like this didn’t have a harmful impact, you’d have to go one step further and make sure it didn’t get uploaded the first time. Because the problem with all of these measures is you have to rely on the victims themselves or someone who might be looking out for the victims to report this content.”
Franks points out that the tech industry has grappled with thorny problems like this throughout its development, from email spam to phishing.
“If any of us remember what it was like 10 years ago, spam was a huge problem on every platform and in every e-mail and now it’s virtually nonexistent,” Franks argues. “It took algorithms, and it took taking stuff out before it got to people. So, we know companies are fully capable of intervening against these very tricky issues if they are incentivized to do so.”
Facebook declined to make Zuckerberg available for an interview, but in a February Facebook post he explained issues like non-consensual pornography are challenging because of the sheer size of the site. He announced that the company is currently researching systems that can flag problematic photos and videos.
“There are cases of bullying and harassment every day that our team must be alerted to before we can help out,” Zuckerberg wrote. “These stories show we must find a way to do more.”
And cyber security experts say it would be possible to use technology similar to facial recognition software to scan for nudity as content is being uploaded. “It’s not unreasonable that a technology could be used to flag various body parts – obviously facial recognition recognizes the face,” says Robert Siciliano, who runs the company ID Theft Security. “It is not unreasonable that a derivative of that technology could recognize a chest and breast and begin to determine that all of that is skin color and to flag that and delete it in the tube prior to it going live.”
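As a rough illustration of the kind of pixel-level heuristic Siciliano describes, the toy filter below scans an image’s pixels for skin-like colors and flags it for review when they dominate. The RGB rule and the 40 percent threshold are arbitrary assumptions for demonstration; production moderation systems use trained classifiers, not anything this crude.

```python
# Toy "flag it in the tube" heuristic: count skin-toned pixels before an
# upload goes live. Purely illustrative -- real nudity detection relies on
# machine-learned models, and this rule would produce many false positives.

def looks_like_skin(r, g, b):
    """A classic, very rough RGB skin-color rule from early vision papers."""
    return (r > 95 and g > 40 and b > 20 and
            r > g and r > b and abs(r - g) > 15)

def flag_for_review(pixels, threshold=0.4):
    """pixels: iterable of (r, g, b) tuples. Returns True when the share of
    skin-like pixels exceeds `threshold`, routing the upload to a reviewer."""
    pixels = list(pixels)
    if not pixels:
        return False
    skin = sum(1 for p in pixels if looks_like_skin(*p))
    return skin / len(pixels) > threshold
```

The point of such a filter is not to make a final judgment but to decide, at upload time, which images deserve a closer automated or human look – which is exactly the “prior to it going live” step Siciliano is describing.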
Lawmakers are also proposing solutions. Representative Speier has proposed a federal bill that would criminalize revenge porn, and a bill that would make it a crime for members of the military to share this content. These laws would not make Facebook, or other platforms, legally liable. But they would put more pressure on them to remove the content – just as they legally have to remove child pornography and copyrighted content when notified.
“It is all part of a continuum,” Speier says. “Before we had the web, harassment took different forms. It was more in person. But now, we really have to take steps to modernize the laws to reflect the way this conduct continues to occur.” (Facebook has publicly supported this bill.)
As Albert continues to try to get these pages removed permanently, he has become a target in the military community. He is regularly ridiculed on Facebook, and has received death threats.
“They said they were going to come shoot up my house,” Albert says. “Mentally I’m treating this just like Afghanistan. When you go to war you have to be realistic and fatalistic about all the possibilities, and that’s what I’ve been doing about all the bad feedback.”
In the meantime, Albert remains focused on the mission at hand. He still spends his days monitoring Facebook closely to make sure these groups don’t reemerge.
Generally, Albert has noticed an improvement since Sheryl Sandberg was notified of the problem. Just this week, Zuckerberg said the company would hire 3,000 more people to review and remove violent images and videos.
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg said in a Facebook post. “We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”
But Albert says that despite the increased attention, the system is far from perfect. Two weeks ago, Erin Kirk-Cuomo notified her contact at Facebook that a group, JTTOTS.2, was sharing photos displaying nudity and discussions of graphic violence. (Representative Jackie Speier reported the related group JTTOTS to military officials in 2013.) It took the company a week to remove the images, and the group is still on Facebook.
“If Facebook could actually shut down these pages and stop them from coming back again all those girls would no longer be victimized on Facebook,” Albert says. “Maybe they would be victimized somewhere else but they would be safe from this group that was identified.”
Update: This story has been updated to include that Facebook has supported Representative Speier’s federal bill to criminalize non-consensual porn.