(2018-03-20) How Facebook Groups Are Being Exploited To Spread Misinformation, Plan Harassment, And Radicalize People

Craig Silverman: How Facebook Groups Are Being Exploited To Spread Misinformation, Plan Harassment, And Radicalize People. One week after the mass shooting in Parkland, Florida, those searching on Facebook for information about the upcoming March for Our Lives were likely to be shown an active group with more than 50,000 members. Called “March for Our Lives 2018 Official,” it appeared to be one of the best places to get details about the event and connect with others interested in gun control. But those who joined the group soon found themselves puzzled. The admins often posted pro-gun information and unrelated memes and mocked those who posted about gun control.

The simple answer is they were being trolled. The more complicated one is that while Facebook groups may offer a positive experience for millions of people around the world, they have also become a global honeypot of spam, fake news, conspiracies, health misinformation, harassment, hacking, trolling, scams, and other threats to users, according to reporting by BuzzFeed News, findings from researchers, and the recent indictment of 13 Russians for their alleged efforts to interfere in the US election.

And it could get worse. Facebook recently announced that group content will receive more prominent placement in the News Feed and that groups overall will be a core product focus to help reverse an unprecedented decline in active users in the US and Canada.

Renee DiResta, a security researcher who studied anti-vaccine (anti-vax) groups on Facebook, told BuzzFeed News that groups are already a preferred target for bad actors.

To her point, the recent indictment of the 13 Russians repeatedly cited Facebook groups as a focus of workers at the Internet Research Agency (IRA).

“By 2016, the size of many organization-controlled groups had grown to hundreds of thousands of online followers,” the indictment said.

One part of the Russian effort on Facebook left unmentioned in the Robert Mueller indictment, and not previously reported, is that Facebook’s group recommendation system encouraged fans of troll-run pages such as Blacktivist and Heart of Texas to organize themselves into groups based on their shared affinity for a page.

There’s no question that groups offer value to many Facebook users.

Secret groups, like the women’s community FIN, are completely invisible to anyone who is not a member, and Facebook users can only join if they are invited by a current member. This is different from closed groups, which can be found via search, show up on user profiles, and enable anyone to request to be added to the group.

But even a group that Facebook itself points to as an example of the product’s value has to wage a battle against bad actors. Omolola and other FIN admins work constantly to keep men and spammers out of the group.

Jennifer Dulski, the head of groups and community at Facebook, told BuzzFeed News that there is now a dedicated “integrity” team for groups, which includes people who specialize in security, product, machine learning, and content moderation.

DiResta sees a parallel, suggesting groups are now at the same stage that pages and the News Feed were at before Facebook’s 2016 wake-up call about fake news: rife with manipulation and lacking proper oversight.

The global exploitation of groups

As the 2016 election moved into its final stretch, members of a Facebook group called Republicans Suck suddenly experienced an influx of new members who spammed the group with links to questionable stories on websites they’d never heard of.

A member named Ison said he and others learned that the Facebook account of one of the group’s original admins had been hacked. With control of the admin’s profile, the hackers were able to add whomever they wanted as admins and remove the group’s original leaders.

BuzzFeed News previously documented how political Facebook groups were exploited to spread political fake news to Americans during the 2016 election and beyond. An article published on Election Day revealed that Macedonian spammers used fake profiles to spam Trump and Bernie Sanders groups to generate engagement for their often false pro-Donald Trump stories.

Rappler, an independent news website in the Philippines, last month published a story about a seemingly fake Facebook profile that spammed groups with links to websites that carried positive news about President Rodrigo Duterte.

Facebook groups are also an engine of misinformation in Myanmar. Thura, a worker with a Myanmar social media research organisation, told BuzzFeed News by email that “Facebook groups are being used widely to spread hatred.”

Spamming fake stories into groups is also the preferred tactic of American fake news publishers. Jestin Coler is a California man who ran more than a dozen early fake news sites such as National Report starting in 2013.

“Joining Facebook groups that are related to the content being promoted, then spamming links throughout those groups using various aliases, is quite effective,” he said. “Members of the group then essentially become ‘bots’ and often share content to their network (outside of the group) creating a more organic-looking campaign.”

Spamming groups is a global tactic on Facebook, and it often involves clickbait and misinformation. For example, in the summer of 2016, a network of sites sprang up that published false stories about bombings and terrorist attacks taking place in different cities. The stories were almost identical except that the location changed in each version.

Keyword squatting and targeted harassment

The group that changed its name to “March for Our Lives 2018 Official” in the wake of the school shooting in Parkland, Florida, is a perfect example. That group has over time gained tens of thousands of members thanks to the admins changing its title to different topical names that are likely to attract new members.

Thura said keyword squatting is a common tactic in Myanmar, too. People will start a Facebook group with a topic that has broad appeal and is not connected to news. Then, once it’s gained enough members, the admins switch its name and start sharing false stories.

Thura said he and others have found it’s often pointless to flag content or complain to group admins about fake news, since the admins are often the ones spreading it.

A marketplace of black hat group services

There are multiple software apps that can automate posting links in groups and also automate the process of joining groups and inviting other profiles into them.

All of the aforementioned services violate Facebook’s terms of service, but they are easily found with a simple online search, or by going to a freelancing site such as Fiverr. Many of these services are offered by people based in countries such as Bangladesh, Pakistan, and India.

Radicalization by the recommendation engine

DiResta, the security researcher, first saw the risks of groups in 2015 while researching health conspiracy content and communities on social networks. She joined anti-vaccine parenting groups on Facebook and watched as well-meaning parents became increasingly radicalized in their views of Western medicine. She also saw false and misleading links spread quickly within groups.

DiResta also documented how group members coordinated the sharing of specific links on Twitter and other platforms to create the impression of an outpouring of support for a specific point of view. “They didn’t have bots but they were effectively replicating the function of bots,” DiResta said. “They were using these groups to coordinate and spread these messages.”

Facebook’s recommendation engine also began its own process of algorithmic radicalization. DiResta noticed that as her account became more involved in anti-vax groups, Facebook shifted the kind of groups it recommended to her. She was soon being shown groups about chemtrails, fluoride conspiracies, and flat Earth. As the 2016 election moved into its final stretch, Facebook suggested she join groups dedicated to the Pizzagate conspiracy theory. DiResta calls it “radicalization via the recommendation engine.”

Facebook also uses data about group membership to recommend new friends. So at the same time that the group recommendation engine can push people further to the fringe, Facebook will suggest new friends who reinforce those perspectives.

“People have a right to search for Pizzagate and join Pizzagate groups, yes,” she said. “But for Facebook to be continuously, proactively surfacing these types of groups to people, that I think is where the ethics of the recommender system is insufficiently developed right now.”

DiResta said Facebook needs to recognize groups are “a potential radicalization pathway.”
