Facebook CEO Mark Zuckerberg to ‘Revisit’ Content Moderation Policies
Facebook CEO Mark Zuckerberg is offering an olive branch to employees upset over his refusal to rein in President Trump’s controversial posts about the George Floyd protests.
On Friday, Zuckerberg published a lengthy 1,370-word post saying the social network will revisit its content moderation policies. Specifically, the review will focus on posts concerning “excessive use of police or state force,” voting during a pandemic, and whether Facebook should try other ways to rein in rule-breaking content, such as placing a warning label over it.
Facebook has faced backlash for its decision to leave up a post from President Trump because it didn’t violate the social media company’s policy about inciting violence.
“As we continue to process this difficult moment, I want to acknowledge the real pain expressed by members of our community,” Zuckerberg wrote. “I also want to acknowledge that the decision I made last week has left many of you angry, disappointed and hurt. So, I am especially grateful that, despite your heartfelt disagreement, you remain focused on taking positive steps to move forward. That can’t be easy, so I just want to say I hear you and I’m grateful.”
He outlined seven areas to be examined after gathering feedback from employees, civil rights experts and internal subject matter experts, often leading with the phrase “We’re going to review,” and cautioning that changes may not be made in all of these areas.
Zuckerberg’s post stopped short of committing to any concrete action, but it made several promises that the Facebook CEO said could help “to heal the divisions in our society.” In recent years, critics have railed against Facebook for algorithmically spreading misinformation, abetting genocide, and failing to protect its users’ personal information.
Last week, Trump came under fire for posting on Facebook that “when the looting starts, the shooting starts,” words the company and its CEO deemed acceptable to leave on the platform. On Friday, Zuckerberg said Facebook will review its handling of content that discusses “excessive use of police or state force” or appears in countries experiencing “ongoing civil unrest or violent conflicts.”