Online platform accountability

Online platform accountability is a significant piece of Canada’s proposed online harms legislation, Bill C-63.

Much has been written about the Online Harms Act already. Observers note that the legislation contains multiple distinct parts: internet platform regulation; the return of Section 13 of the Canadian Human Rights Act; and, perhaps most controversial, new Criminal Code provisions.

As I have written before, my own thinking has been heavily influenced by the late Alan Borovoy, a great Canadian civil rights lawyer, who used to say we should censure, not censor, those who spew hate speech.

I ran across a relevant paper from the International Center for Law & Economics (ICLE) that contributes to the discussion. Two and a half years ago, Geoffrey A. Manne, Kristian Stout, and Ben Sperry wrote “Who Moderates the Moderators?: A Law & Economics Approach to Holding Online Platforms Accountable Without Destroying the Internet” in the context of Section 230 of the US Communications Decency Act of 1996. As the authors put it:

To the extent that the current legal regime permits social harms online that exceed concomitant benefits, it should be reformed to deter those harms if such reform can be accomplished at sufficiently low cost. The salient objection to Section 230 reform is not one of principle, but of practicality: are there effective reforms that would address the identified harms without destroying (or excessively damaging) the vibrant Internet ecosystem by imposing punishing, open-ended legal liability? We believe there are.

A common set of objections to Section 230 reform has grown out of legitimate concerns that the economic and speech gains that have accompanied the rise of the Internet over the last three decades would be undermined or reversed if Section 230’s liability shield were weakened. Our paper thus establishes a proper framework for evaluating online intermediary liability and evaluates the implications of the common objections to Section 230 reform within that context. Indeed, it is important to take those criticisms seriously, as they highlight many of the pitfalls that could attend imprudent reforms.

The ICLE authors assert that actual harms are occurring on online platforms: violations of civil law and civil rights, violations of criminal law, and tortious conduct. These impose real costs on individuals and society at large, and therefore justify establishing a means to apply a measure of order and accountability to the online world.

According to ICLE, intermediary liability applied to the platforms can be a cost-effective approach to online platform accountability. The paper notes that “the fundamental principles that determine the dividing line between actionable and illegal or tortious content offline can and should be respected online, as well”.

I found the executive summary and the full paper [pdf, 1MB] worth the time to review.

Still, as I wrote on Twitter (X) a few weeks ago, I remain concerned that Canada’s Online Harms Act may not address the root cause of harms online. The Act may indeed reduce the number of despicable posts we see on digital platforms, but it will do nothing to reduce the number of despicable people who hold those despicable points of view. It will just be harder to find them.
