Yesterday afternoon, former Facebook data scientist Sophie Zhang testified to the Joint Committee on the Draft Online Safety Bill (you can watch the full testimony here).
This comes after she spoke out in April against her former employer.
Zhang’s allegations strongly reinforced what Frances Haugen said before a U.S. Senate subcommittee last week:
Facebook is knowingly allowing the propagation of large-scale disinformation campaigns because, to put it simply, executives can’t stomach even the smallest hit to their bottom line.
Here is an overview of the situation with Facebook to date. The company’s choice of profits over human safety, and its propagation of targeted hateful, misleading and abusive content, has undermined democracy and led to real events with real consequences, such as the insurrection on January 6th.
Zhang, whose former role involved dealing with bots and fake accounts, gave Members of Parliament a good idea of how severe the problems are.
She described direct experience of bringing issues to the attention of executives and being shot down. When she raised concerns about fake Facebook accounts manipulating elections in Honduras, she said the company failed to “agree on the importance” of the problem.
“The people charged with making important decisions about what the rules are and how the rules are getting enforced are the same as those charged with keeping good relationships with local politicians and governmental members”, she explained.
She also gave an illuminating perspective on how the Draft Online Safety Bill should contend with giant tech platforms like Facebook.
Zhang testified that allowing social media companies to conduct their own internal risk assessments would likely lead to them “pretend[ing] that the problem doesn’t exist”, and thus not reporting to Ofcom.
Instead of this self-reporting model, Zhang advocated having Ofcom, external regulators and independent digital experts test and assess how effectively the platforms keep users safe from misinformation and abuse.
Facebook is making little to no progress towards a consistent policy on dealing with the safety of its users.
Thanks to whistleblowers like Haugen and Zhang, the public is finally beginning to get the full picture. In what many are calling Facebook’s “Big Tobacco Moment”, citizens and government officials across the political spectrum are beginning to understand the importance of comprehensive reform to the digital space.
It’s critical that the Online Safety Bill – as well as other similar attempts at digital regulation globally – gets it right. We can’t allow platforms like Facebook to tear at the social fabric of nations around the world for much longer – the consequences will only get more severe.