For weeks, Facebook has been shaken by revelations that have ignited a firestorm of criticism from lawmakers, regulators and the public.
Reports by The Wall Street Journal, based on internal research documents provided by a whistle-blower, put Facebook under a microscope. Those reports showed, among other things, how Facebook knew Instagram was worsening body image issues among teenagers.
The whistle-blower goes public.
The whistle-blower, Frances Haugen, went public during an interview on “60 Minutes” in early October. On Oct. 5, Ms. Haugen testified before a Senate subcommittee for more than three hours. She said Facebook had purposely hidden disturbing research about how teenagers felt worse about themselves after using its products and how it was willing to use hateful content on its site to keep users coming back. In her testimony, she encouraged lawmakers to demand more documents and internal research, suggesting the documents she had provided were just the tip of the iceberg.
After Ms. Haugen testified, executives publicly questioned her credibility and called her accusations untrue. Internally, however, they worked to frame their positions in ways that would hold on to the goodwill of more than 63,000 employees and assuage their concerns.
Leaked documents reveal internal struggles.
Reporters have since covered more internal documents from the company, which owns Instagram and WhatsApp in addition to the core Facebook social network. Documents about Instagram, for instance, reveal a company that is struggling with retaining, engaging and attracting young users.
Other documents raise questions about Facebook’s role in election misinformation and the pro-Trump attack on the Capitol on Jan. 6. Company documents show the degree to which Facebook knew of extremist movements and groups on its site that were trying to polarize American voters before the election. Employees believed Facebook could have done more, according to the documents.
The challenges are global.
In India, Facebook’s biggest market, the problems are bigger, too. Internal documents show a struggle with misinformation, hate speech and celebrations of violence. Dozens of studies and memos written by Facebook employees provide stark evidence of one of the most serious criticisms levied by human rights activists and politicians against the world-spanning company: It moves into a country without fully understanding its potential impact on local culture and politics, and fails to deploy the resources to act on issues once they occur.
The latest revelations, published on Monday morning, show internal research that undercuts the heart of social networking — “likes” and sharing — that Facebook revolutionized. According to the documents, researchers determined over and over that people misused key features or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics” — meaning the basics of how the product functioned — that had let misinformation and hate speech flourish on the site.
Without government-mandated transparency, Facebook can present a false picture of its efforts to address hate speech and other extreme content, Ms. Haugen told Britain’s Parliament. The company says artificial intelligence software catches more than 90 percent of hate speech, but Ms. Haugen said the number was less than 5 percent.
“They are very good at dancing with data,” she said.