Final Report 2022: How to Make Online Platforms More Transparent and Accountable to Canadian Users

Free speech is a hallmark of any democratic society. While social media platforms have allowed more people to participate in public discussion and debate, we cannot ignore the very real harms caused by the rise of disinformation, conspiracy theories and hateful rhetoric on these platforms, and the damage this has done to democratic and free expression in Canada.

Canada is not alone in grappling with these issues; democratic governments around the world are proposing their own policy solutions. Canada has an opportunity to be a global leader in this space if it focuses on platform systems as a whole instead of merely reacting to individual pieces of bad content or individual bad actors.

To achieve this, the Canadian Commission on Democratic Expression’s 2021-22 annual report makes a series of recommendations to increase platform transparency, accountability, and empowerment.

Transparency 

Platforms collect our personal data and feed it into algorithms and other proprietary tools that shape our public discourse – often by targeting users with divisive and harmful content – yet the platforms’ processes remain opaque and hidden from the public. Platforms need to be much more transparent about how our data is collected and shared with third parties, and about how algorithms use that data to promote and amplify content back to us.

The recommendations in this area are designed to radically increase transparency in order to strengthen oversight, while equipping the public, researchers and journalists with the tools to identify and address the structural issues that lead to the amplification of harmful content.

Accountability 

Transparency in and of itself is insufficient if there are no accountability mechanisms. In order to hold platforms accountable, we can draw upon a legal model already in widespread use in Canadian law – a statutory duty to act responsibly.

Currently, platforms do not have to consider the harmful effects of their products and are free to distribute and algorithmically promote content solely on the basis of financial motives. There is a conflict here between the financial interests of these platforms and the larger public good, including the safeguarding of our democracy and our democratic rights.

Subjecting platforms to the duty to act responsibly would place the onus on platforms to demonstrate that they have conducted risk assessments with respect to harm and have taken steps to minimize the harm of their products. 

Makers of all other consumer-facing products must consider the risks their products pose and demonstrate that steps have been taken to mitigate those risks. Platforms should not be exempt from this.

Given the scale of the issue (hundreds of billions of pieces of content) and the need to ensure people’s right to democratic expression is maintained, the recommendations in this area take an approach centred on platforms’ systems and incentives as a whole, rather than reacting to problematic content once it is already online.

Empowerment

Platform users currently face a large structural imbalance of power. To address this, we must ensure people are empowered to manage and control their online presence. This means significantly strengthening user rights in terms of data privacy protection, mandating interoperability and data portability, and embarking on a comprehensive civic and digital literacy program.

The recommendations in this area aim to redress structural power imbalances and inequalities that are often amplified by the platform systems.

Full List of Recommendations

Transparency 

1. Create a new independent federal regulator with investigative, auditing, and enforcement powers to ensure the duty to act responsibly is applied to platforms.

  • The regulator would be formally independent from the government and would report to Parliament at regular intervals

  • This new regulator would investigate perceived harms, assess platform liability, and determine and enforce remedies when platform liability is established.

2. Implement separate tiers of data access for the public, researchers, journalists and civil society groups

  • There would be three graduated tiers of data access: data available to the general public; data available to accredited researchers and journalists; and data for more specialized and detailed research in the public interest, which would require significant additional safeguards to protect privacy

3. Mandate social media platforms to be transparent about their advertising

  • Social media platforms would be required to disclose and regularly archive specific information about every digital advertisement and paid content post, in a standardized, machine-readable format that meets minimum disclosure standards (a purely illustrative sketch of such a record follows below)
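The report does not prescribe a particular schema for these archives, but a minimal sketch can illustrate the idea. The record below is purely hypothetical; every field name and type is an assumption made for illustration, not something specified by the Commission.

    // Purely illustrative sketch of what one machine-readable ad disclosure
    // record might contain. Field names and types are assumptions; the report
    // only calls for a standardized format with minimum disclosure standards.
    interface AdDisclosureRecord {
      adId: string;                         // unique identifier assigned by the platform
      platform: string;                     // platform on which the ad or paid post ran
      advertiserName: string;               // entity that purchased the ad
      payingEntity?: string;                // ultimate funder, if different from the advertiser
      isPaidContent: boolean;               // true for sponsored or boosted organic posts
      creative?: string;                    // ad copy, or a link to the stored creative
      startDate: string;                    // ISO 8601 date the ad began running
      endDate?: string;                     // ISO 8601 date the ad stopped running, if known
      spendRangeCad?: [number, number];     // disclosed spend range, in Canadian dollars
      impressionsRange?: [number, number];  // disclosed reach or impressions range
      targetingCriteria?: string[];         // declared targeting parameters (age, region, interests)
    }

Publishing and regularly archiving records along these lines would let researchers and journalists query advertising activity directly, rather than relying on ad hoc disclosures from the platforms.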

4. Codify and enshrine stronger protections for whistleblowers

  • Current or former employees who expose corporate malpractice need protections against legal, economic and reputational retaliation by their employers

Accountability

5. Create a new independent federal regulator with investigative, auditing, and enforcement powers to ensure the duty to act responsibly is applied to platforms.

  • The regulator would be formally independent from the government and would report to Parliament at regular intervals

  • This new regulator would investigate perceived harms, assess platform liability, and determine and enforce remedies when platform liability is established.

6. Ensure existing regulators are properly empowered and equipped to operate in the 21st-century digital world.

  • Regulators that enforce competition policy and privacy regulations currently lack the resources and authority needed to operate effectively in our digital world. In addition to more qualified personnel (data scientists, AI specialists, etc.), they need the ability to impose remedies on the platforms, and the authority and flexibility to communicate with one another to ensure greater inter-agency cooperation

7. While all platforms would have a duty to act responsibly, obligations should be tiered based on the size and scope of each platform, as well as on whether a platform is likely to be accessed by minors

  • Matching obligations to the size of platforms would ensure greater compliance, while also promoting an environment that isn’t hostile to competition and innovation

  • Adapting platform obligations to the age of their users would ensure children are better protected online and would bring Canada in line with internationally established safeguards for children

8. Legislate intermediary liability protections and exceptions for platform liability

  • Canada is the only G7 country without comprehensive intermediary liability laws in place. The federal government should introduce legislation that incorporates intermediary liability protections that are consistent with its trade obligations under the 2020 Canada-United States-Mexico Agreement (CUSMA)

  • This legislation would clarify when platforms can and can’t be held liable for harms arising from content posted on the platforms by users

9. Regulatory entities should develop and implement a rights-based algorithmic accountability framework which includes algorithmic impact assessments (AIAs), human rights impact assessments (HRIAs) and algorithmic audits

  • AIAs encourage the identification and mitigation of potential risks posed by algorithmic systems, and should be conducted on a regular basis

  • HRIAs would ensure platforms are undertaking a regular examination of the impact of their products on human rights

10. Develop a Code of Practice on Disinformation

  • Fighting disinformation must be a shared responsibility and goal between public institutions and platforms

  • The overall aim of the Code would be to create platform-based policies and procedures that address disinformation in an efficient and flexible way

Empowerment

11. Support Indigenous knowledge, relationships and protocol development and Indigenous data governance for Indigenous communities.

  • Ensuring the meaningful participation of Indigenous peoples’ representatives will safeguard Indigenous peoples’ interests in addressing online democratic issues and inform broader collective values of reciprocity and self-determination

12. Substantially strengthen civic education respecting rights, digital literacy and access to quality information to support equity-seeking groups and community-led programs

  • Public education and digital literacy initiatives should provide the public with an understanding of their rights and freedoms, how digital media works and its impact on public opinion, and how structural biases operate within the digital framework

  • Underrepresented groups should be supported through targeted policies, as should children and their guardians

13. Mandate interoperability and data mobility

  • Canada should ensure the interoperability of digital services to empower individuals with greater control and choice in their interactions online

  • Canada should also ensure individuals have the right to have their personal data transmitted directly from one platform to another (a purely illustrative sketch of what such an export might contain follows below)
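The report does not define what a portable data export would contain; the sketch below is a hypothetical illustration only, with field names chosen for the example rather than drawn from the report.

    // Purely illustrative sketch of a user-data export that one platform could
    // transmit to another at the individual's request. All names are assumptions.
    interface PortableDataExport {
      exportedAt: string;            // ISO 8601 timestamp of the export
      sourcePlatform: string;        // platform transmitting the data
      profile: {
        displayName: string;
        contacts: string[];          // identifiers of accounts the user follows or connects with
      };
      posts: Array<{
        createdAt: string;           // ISO 8601 timestamp
        text: string;                // content authored by the user
        visibility: "public" | "followers" | "private";
      }>;
    }

Interoperability would further require that the receiving platform be able to ingest such an export, so that switching services does not mean abandoning one’s data and connections.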

14. Modernize Canada’s privacy legislation

  • Canada’s privacy regime for the private sector does not reflect the current realities of a digital world

  • Canada should update its legislation to a rights-based framework that can accommodate current and future technological developments

  • The Privacy Commissioner of Canada should be given the authority to modernize Canada’s current privacy framework, including rules on how private platforms can collect and process individual user data and use it to target users

Read the final report of the 2021-22 Canadian Commission on Democratic Expression.

2021-22 Commissioners


The Right Honourable Beverley McLachlin, PC, CC, Commission Co-Chair

Taylor Owen, Beaverbrook Chair in Media, Ethics and Communications and Associate Professor, Max Bell School of Public Policy, McGill University, Commission Co-Chair

Rick Anderson, Principal, Earnscliffe Strategy Group

Wendy Chun, Canada 150 Research Chair in New Media, Simon Fraser University

Nathalie Des Rosiers, Principal, Massey College, Full Professor, Faculty of Law (Common Law) University of Ottawa, Distinguished Visitor, Faculty of Law, University of Toronto

Amira Elghawaby, Director of Programming and Outreach, Canadian Race Relations Foundation

Merelda Fiddler-Potter, Vanier Scholar and PhD Candidate, Johnson Shoyama Graduate School of Public Policy

Philip N. Howard, Director, Programme on Democracy and Technology and Professor of Internet Studies, Balliol College, University of Oxford

Vivek Krishnamurthy, Samuelson-Glushko Professor of Law at the University of Ottawa


2021-22 Expert Briefings

Rebekah Tromble, Director of the Institute for Data, Democracy & Politics, Associate Professor School of Media & Public Affairs, George Washington University 

J. Nathan Matias, Assistant Professor, Cornell University Department of Communication and Founder of the Citizens and Technology Lab

Laura Edelson, Ph.D. Candidate in Computer Science, NYU Tandon School of Engineering

Ethan Zuckerman, Associate Professor of Public Policy, Communication, and Information at the University of Massachusetts at Amherst and Founder of the Institute for Digital Public Infrastructure

Seeta Peña Gangadharan, Associate Professor at the Department of Media and Communications at the London School of Economics (LSE)

Laura Murphy, Civil Liberties and Civil Rights Leader, Policy Strategist

Kate Klonick, Assistant Professor of Law, St. John’s University Law School and Affiliate Fellow, Information Society Project, Yale Law School

Ravi Naik, Legal Director, AWO Agency

Emily Laidlaw, Canada Research Chair in Cybersecurity Law and Associate Professor at the University of Calgary, Faculty of Law

Andrew Strait, Associate Director, Ada Lovelace Institute

Divij Joshi, Doctoral Researcher, University College London

Jennifer Wemigwans, Assistant Professor, University of Toronto

Marietje Schaake, International Policy Director, Stanford Cyber Policy Center

Meetali Jain, Deputy Director, Reset.tech

Mark Scott, Chief Technology Correspondent, POLITICO

Ryan Merkley, Managing Director, Aspen Digital, Aspen Institute
