The mandate letters for the Minister of Canadian Heritage and for the Minister of Justice and Attorney General each contain a section calling for the Ministers “to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host”.
It is worth examining the report of the Joint Committee on the Draft Online Safety Bill, issued by the UK House of Lords and House of Commons in mid-December [pdf, 2.0 MB].
The UK report opens powerfully:
Self-regulation of online services has failed. Whilst the online world has revolutionised our lives and created many benefits, underlying systems designed to service business models based on data harvesting and microtargeted advertising shape the way we experience it. Algorithms, invisible to the public, decide what we see, hear and experience. For some service providers this means valuing the engagement of users at all costs, regardless of what holds their attention. This can result in amplifying the false over the true, the extreme over the considered, and the harmful over the benign.
The human cost can be counted in mass murder in Myanmar, in intensive care beds full of unvaccinated Covid-19 patients, in insurrection at the US Capitol, and in teenagers sent down rabbit holes of content promoting self-harm, eating disorders and suicide. This has happened because for too long the major online service providers have been allowed to regard themselves as neutral platforms which are not responsible for the content that is created and shared by their users. Yet it is these algorithms which have enabled behaviours which would be challenged by the law in the physical world to thrive on the internet. If we do nothing these problems will only get worse. Our children will pay the heaviest price. That is why the driving force behind the Online Safety Bill is the belief that these companies must be held liable for the systems they have created to make money for themselves.
At 193 pages, the report is a lengthy but worthwhile read; its 127 recommendations for the draft legislation span 25 pages.
Having been involved in a number of projects and committees with the late human rights lawyer Alan Borovoy, I am sympathetic to his view that “We should not censor those making racist statements but censure them.” Professor Daniel Lyons of Boston College writes “Traditionally, the solution to bad speech is more speech, not censorship.” He expresses concern that attempting to control the flow of information online, like other attempts to craft digital legislation, can lead to unintended consequences.
Big Tech’s critics are correct that platforms can be misused to disseminate socially undesirable content online. But the remedy is to address the source of the problem — harmful users and the social conditions that make their messages attractive to others.
Still, it is difficult to reconcile accepting content in a digital stream that would be unacceptable in print — or, as the UK report puts it, enabling “behaviours which would be challenged by the law in the physical world to thrive on the internet.”
Following its consultation process, those crafting Canada’s new law will need to maintain a careful balance. Perhaps those drafting the legislation should consider some of the UK recommendations in their initial draft.
“If we do nothing these problems will only get worse. Our children will pay the heaviest price.”