“How to Fix Facebook” (The Decibel)

Taylor Owen
October 6, 2021
Summary by Yasmeen Safaie

 
 

Photo Sources: WSJ, Council on Foreign Relations

 

On Sunday, October 3rd, Frances Haugen, a former Facebook product manager who worked on civic integrity issues at the company, revealed that she was the source of the recent leak of thousands of internal Facebook documents. The documents show how the company weakened or withheld safeguards on Instagram and Facebook against the spread of toxic content harmful to teenage users and against mis/disinformation around the Capitol attack. Haugen testified before a U.S. Senate hearing on Tuesday, a day after Facebook, Instagram, and WhatsApp – all platforms owned by Facebook – went down worldwide for roughly six hours.

The files, leaked to the Wall Street Journal and published as part of its multi-part investigation “The Facebook Files”, outline the many harms Facebook has left unmitigated despite internal research pointing to the problems. These include:

  • Knowledge that Instagram circulates toxic content that harms the mental health of many users, especially teenage girls;

  • Research pointing to the harmful effects of the algorithms Facebook implemented, which promote hateful content online; and

  • Reports about the use of Facebook for human trafficking, hate crimes, and drug trafficking, among other harms.


In light of Haugen’s testimony, Centre Director Taylor Owen spoke with The Globe and Mail’s podcast The Decibel to discuss what kinds of regulations governments should implement to prevent Facebook from perpetuating these harms.

Taylor reflects that Haugen’s testimony revealed the core of the problem with platforms such as Facebook, and a truth we must face with clarity: the financial incentives of these companies are misaligned with the public good. Because they answer to their shareholders, these companies will always choose growth and engagement over minimizing harm.

The Facebook leak also revealed two new pieces of information which, Taylor says, change the current discourse about platform governance. First, the discussion around the harms of platforms such as Facebook has largely concerned political movements and the trade-offs between social media as a place for mobilizing advocacy movements such as Black Lives Matter and as a place for mobilizing movements that spread misinformation and conspiracy, such as QAnon. What Haugen’s testimony reveals is that trade-offs have also been made between boosting engagement and minimizing harm to children. Second, Taylor says we now know just how much Facebook – a company that is unique in that it carries out its own research – knew about the harms its algorithms perpetuate, and that it chose not to act on that research.

So what should governments do to intervene, if anything? Taylor echoes Haugen’s suggestion against breaking up Facebook and says that policies increasing accountability and transparency may be more impactful in the immediate future. Rather than focusing first on regulating speech, governments such as Canada’s should focus on policies that mandate algorithm audits, data sharing with regulatory bodies, and harm assessments of the kind already required in other sectors, such as pharmaceuticals.

Highlights

  • Haugen cuts to the core of the problem: the financial incentives of these companies are misaligned with the public good;

  • When Facebook and other large platforms have to choose between growth and engagement on the one hand and minimizing harm to the public on the other, they choose growth;

  • Much of the current discourse has concerned the wellbeing of adults, weighing the trade-offs of allowing social media platforms to be a space for mobilizing political movements, whether those movements advocate for social justice or promote hate speech; now the focus is on the harms a company inflicts on children;

  • We now know that Facebook knew about the harms it was perpetuating and did not act on these findings. Facebook compared its own impact with that of competitors such as Snapchat and TikTok – platforms seen as more performative rather than literal representations of people’s lives – and found that it harmed the mental health of users, particularly teenagers;

  • Facebook – and the algorithms programmed by its employees – knows that what engages us most is often content that makes us angry, prompting us to hate-share and comment; oftentimes this content runs counter to democratic ideals;

  • Facebook is hard to regulate because we don’t have access to all of its data, and because it has empowered so many people, given many disenfranchised people a voice, and enabled global communication, all of which has obscured the platform’s costs;

  • Breaking up these companies through competition policy could make a difference in the long run, but in the immediate term many policies could make Facebook more transparent and accountable, as Haugen suggested;

  • Increasing transparency could involve: sharing data with researchers, sharing data and access with regulatory bodies (either Congress or new internet regulators), auditing algorithms, mandating harm assessments, and imposing fines and penalties for inaction – requirements already typical in other sectors, such as pharmaceutical companies like Pfizer; and

  • Canada would benefit from shifting its focus from policies governing online speech to policies that increase the transparency and accountability of these platforms and companies.

Listen to the full interview at the link below.

 
 
 