Facebook’s executives reportedly resisted efforts to dial back features that help amplify false and inflammatory content ahead of the 2020 US election because they feared doing so could harm the platform’s usage and growth.
The Wall Street Journal, citing leaked internal documents, said Facebook employees suggested changes that could have slowed the spread of viral content for everyone, such as removing the reshare button or no longer promoting reshared content unless it came from a user's friend. Kang-Xing Jin, who heads Facebook's health initiatives, has been a proponent of these types of changes, according to the report. But executives such as John Hegeman, Facebook's head of ads, raised concerns about stifling viral content.
"If we remove a small percentage of reshares from people's inventory," Hegeman wrote in internal communications cited by The Journal, "they decide to come back to Facebook less."
The report is the latest in a series based on leaked internal documents and communications that The Journal says show Facebook has put its profits over the safety of its users. Frances Haugen, a former Facebook product manager, publicly identified herself as the whistleblower who gathered the leaked documents used by The Journal. The findings from these internal documents have reignited scrutiny from US and UK lawmakers. Haugen, who has already appeared before Congress, is scheduled to testify before the UK Parliament on Monday.
Facebook has repeatedly said its internal research and correspondence is being mischaracterized. "Provocative content has always spread easily among people. It's an issue that cuts across technology, media, politics and all aspects of society, and when it harms people, we strive to take steps to address it on our platform through our products and policies," a Facebook spokesman said in a statement.
The moderation of political content, though, has been a hot-button issue for the company as it tries to balance safety with concerns about hindering free speech. Conservatives have also accused Facebook of intentionally censoring their content, allegations the company denies.
Rather than making broad changes that would be less likely to raise alarms about free speech, Facebook has moderated content from groups it considers dangerous in a way The Journal describes as a game of whack-a-mole.
The New York Times, also citing internal documents, reported Friday that Facebook failed to address misinformation and inflammatory content before and after the 2020 US presidential election even though employees had raised red flags about the issue.
Supporters of Donald Trump, who lost the presidential election to Joe Biden, were posting false claims that the election had been stolen. After the deadly riot at the US Capitol in January, Facebook suspended Trump from its platform until at least 2023 over concerns that his comments could incite further violence.
One Facebook data scientist found that 10 percent of all US views of political content were of posts alleging the vote was fraudulent, according to The Times. Facebook employees also felt the company could have done more to crack down on misinformation and conspiracy theories.
A Facebook spokesperson said the company spent more than two years preparing for the 2020 election and that more than 40,000 people now work on safety and security. The company adjusted some of its measures before, during and after the election as it received more information from law enforcement. "It is wrong to claim that these steps were the reason for January 6th — the measures we did need remained in place well into February, and some like not recommending new, civic, or political groups remain in place to this day," Facebook said.
The Times story is part of a series expected from an international group of news organizations that also received documents from Haugen, according to The Information. More stories are expected next week, when Facebook reports earnings and holds its Connect conference on augmented and virtual reality.