The Online Safety Bill is a riddle, wrapped in a mystery, inside an enigma


There are few legislative initiatives as baffling as the UK government’s quest to impose new safety measures to protect users from all that is evil on the internet. Ensuring that illegal content is handled appropriately by platforms seems axiomatic, but creating a legal obligation to remove “legal but harmful” content was always going to provoke a clash between free-expression purists and, for example, advocates for stronger protection against cyber-harassment. The Online Safety Bill (OSB) has exposed both the best and the worst of the actors who have a stake in how social media platforms are regulated.

Making Nadine Dorries the spokesperson for such a controversial bill might not have been the wisest tactical decision Boris Johnson made (the MP for Mid Bedfordshire was herself caught sending abusive tweets), but when Johnson left 10 Downing Street, so did Dorries. New Culture Secretary Michelle Donelan has vowed to scrap the “legal but harmful” provisions found in Dorries’ version of the OSB and replace them with new measures designed to put greater emphasis on freedom of expression.

Meanwhile, former DCMS committee chairman and Zuckerberg agitator Damian Collins has suggested the provisions will not be removed entirely, surviving instead in a “mangled form”. Although the bill was due for its third reading in the House of Commons on November 1, it was withdrawn from Commons business. And with Johnson gone, and Liz Truss arriving and then hastily departing, the mess has now landed on Rishi Sunak’s desk.

The new Prime Minister has the unenviable task of deciding whether freedom of speech, and all its consequences, is worth restricting in order to protect children from the abuse they are likely to face on social media platforms, while ministers suggest that continued delays could force the government to abandon the legislation entirely. Even the House of Lords is getting involved: to ensure the bill can get through this session, the Lord Speaker has allowed an urgent question on the OSB’s future, following the prevention of future deaths report written at the end of the inquest into the death of Molly Russell.

The OSB’s continued delays have left almost everyone perplexed. Although the bill was heavily criticized for its disregard for free speech rights, it enjoyed the political support of almost every major politician. Only Johnson’s implosion and subsequent removal from 10 Downing Street succeeded in derailing the imposition of a wide range of obligations: duties of care, accountability measures, transparency rules and oversight of platforms by Ofcom.

This derailment has created such a vacuum that the OSB might actually be put aside for another day. Even if it does make it through parliament, we are a long way from any real understanding of its implementation: the 218-page bill will need to be fleshed out with secondary legislation, codes of practice, Ofcom guidelines and designated categories of service providers, each with their own step-by-step procedures. All of this will be necessary to understand how to impose a legal obligation on platforms to police speech that can cause harm.

The legal requirement to remove “legal but harmful” content, along with a host of new criminal offences, remains at the heart of the controversy surrounding the bill. As tech lawyer Graham Smith rightly points out, the new offence creates a veto for those who are distressed by views they consider repugnant, even if that reaction is unreasonable. The risk of chilling lawful discourse is exacerbated when the offence is combined with the illegality duty that the bill, in its current form, would impose on platforms and search engines.

Despite the many laws already in place to address some of the identified harms, and the many laws regulating content, actions and behavior, the OSB shifts the government’s own policing responsibilities onto the platforms; in other words, “it’s your platform, so you take care of it”. The OSB’s philosophy is simple: platforms are no different from theme parks, offices and restaurants; as platforms are places where people congregate, imposing a duty of care will work just as well between platforms and users.

Risk-based legal regimes such as the UK’s Health and Safety at Work Act have successfully deployed a duty of care before. But that regime creates duties of care for physical environments, where the cost of slipping on a wet floor can be priced into everything from warning signs to commercial insurance. The online world is different. Imposing a duty of care puts a price on content, establishing a transaction cost for carrying it. Speech deemed too costly for the platform will be filtered, blocked or deleted ex ante rather than ex post, particularly when the uncertainty surrounding the content is judged to carry too high a transaction cost, regardless of its actual risk. In other areas, the imposition of a duty of care mitigates the costs of distributing uncertainty through legal conventions. It remains unclear how a law can protect freedom of expression when some of that expression will remain risky for some of those who may encounter it.

The Online Safety Bill remains a riddle, wrapped in a mystery, inside an enigma.

Dr. Mark Leiser is Professor of Technology Law at Vrije Universiteit Amsterdam.

