Facebook's Content-Moderation Policies Are a Hot Mess

I confess: I still have a Facebook account. This might seem unusual for someone who works on curbing the spread of hate speech online.

Much of my feed consists of puppies, babies and posts from my local Buy Nothing Group. Your Facebook feed probably looks a lot different from mine. We have different friends, different likes, different group affiliations and different interests. As a result, Facebook’s algorithms deliver an experience unique to each of us, specifically designed to keep us hooked on the platform. On Facebook, none of us share the same reality.

That some of us don’t see hateful posts in our newsfeeds doesn’t diminish the impact they have on those who do. Facebook has long deployed algorithms that amplify and spread the organic posts, ads, events, groups and pages that endanger people’s lives.

The proliferation of divisive and hateful content is baked into Facebook’s business model: The platform profits from finely targeting ads to users who are most likely to respond to them.

Facebook’s own research shows that divisive content more deeply engages users. And a Facebook executive recently revealed that “right-wing populism is always more engaging” than centrist or progressive fare, with content that touches on topics such as “nation, protection, the other, anger, [and] fear.”

The guidelines for what is acceptable or banned on the platform sprawl across a confusing labyrinth of pages on Facebook’s humongous corporate site. You’ll need to look at all of the Community Standards, Terms of Service, Newsroom and Help Center posts to begin to get a clear picture. And just when you think you’ve figured it out, Facebook announces another policy change or tweaks its standards.

By intentionally seeding this patchwork, Facebook creates the illusion of transparency. This tangle of conflicting policies and webpages — in addition to the patchwork of internal guidelines to content moderators — allows Facebook to systematically break its promise to keep people safe, allowing white supremacists and hate groups to use the platform to spread brutal ideologies, fundraise and organize events that incite violence.

We saw a tragic example of this just recently, when the event page of the Kenosha Guard, a right-wing paramilitary group, invited its members to arrive armed at a racial-justice protest in Wisconsin. The event was flagged 455 times in a day and reviewed by four human moderators who determined it did not violate Facebook’s policies. Facebook took notice only after a 17-year-old crossed state lines and killed two people protesting the police shooting of Jacob Blake. Mark Zuckerberg called Facebook’s failure to remove the page an “operational mistake.”

These types of “mistakes” are inexcusable — especially since the organization Muslim Advocates has been sounding alarms about event-page abuses since 2015. Facebook repeatedly ignored the group’s prescient warnings and repeatedly failed to take down pages promoting hateful rallies. And even in the wake of Kenosha, Facebook has yet to modify its events policies to clearly indicate that calls to arms are prohibited.

In an attempt to placate critics, Facebook has announced a slew of policy changes in recent weeks. These include a ban on ads that “praise, support or represent militarized social movements,” the addition of context to posts when a candidate or party prematurely declares victory, a ban on ads that delegitimize the election’s outcome, and a ban on political ads after Election Day until a winner is officially declared.

Facebook also banned all pages representing QAnon and other violent groups across its platforms (though not the content itself). The company updated its hate-speech policy, too, adding Holocaust denial to the list of prohibited content. A New York Times opinion piece by Charlie Warzel noted that these changes are a “tacit admission that what is good for Facebook is, on the whole, destabilizing for society.”

Civil-rights and racial-justice organizations, scholars, activists and journalists have long pressured Facebook to make these kinds of changes — and many others. And for its part, Facebook is spinning these updates as “progress,” even though they came too late — a month before the presidential election — to repair the significant damage inflicted throughout 2020. 

Clear action steps to mitigate hate online have been available to Facebook for years. In 2018, a coalition of groups, including Free Press, crafted Change the Terms — a set of model policies to curb hateful activities on online platforms. Among the many recommendations is that platforms enforce policies in a transparent, equitable and culturally relevant way.

Despite its recent changes and enforcement actions, Facebook still has a long way to go to meaningfully address its failure to protect our communities and our democracy. People’s lives are on the line.
