The Online Safety Bill returned to Parliament on 5 December after a delay of five months caused in no small part by the turmoil in the Conservative party. Although the revived Bill comes with significant and material amendments, it will make no rapid progress through Parliament; instead it has been sent back to committee stage for further scrutiny.
Amongst the new amendments, three stand out as significant to communication service providers, at whom, in a regulatory context, the Bill is largely aimed.
First, providers will not be required to make a judgment call in relation to content that might be considered "harmful" but is not illegal. A lack of certainty and the inevitable subjectivity involved in assessing potentially harmful content, together with concerns over the incursion into freedom of expression that this might entail, have precipitated a tightening of the draft so that providers will only be required to remove content that is illegal. For some, this will be regarded as a watering down of one of the Bill's main aims, but it will likely make regulatory compliance less onerous for providers.
Secondly, however, the quid pro quo is that providers will owe duties of care to adult users: they must give them greater information about, and control over, the content they may wish to access, together with the necessary mechanisms to shield themselves from content they wish to avoid. Further, in something of a volte-face, providers will not be able to remove or restrict legal content, or ban or restrict a user, unless the circumstances for doing so are clearly set out in their terms of service or the content is against the law. This is a major shift in favour of freedom of expression, prompted by a concern that providers erring on the side of caution when assessing "harmful" content would, as an unintended consequence, disproportionately curtail free speech through the removal of lawful content.
Together, these first two changes amount to what the government has termed the "triple shield of protection" for users of online platforms. The obligation is placed squarely on providers to remove illegal content, take down material that breaches their own terms of service, and give adults greater knowledge of, and choice over, what they view.
Thirdly, there are further protections for children. Providers will be required to publish their risk assessments of any dangers that a site's content might pose to children, and a further amendment will clarify providers' responsibilities to put age-appropriate protections in place. Where a minimum age for users applies, providers' terms of service will need to explain the measures used to enforce it, such as age verification technology.
It is intended that the Bill should be enacted by May 2023. That date may of course be delayed further still, but the level of regulation and compliance the eventual Act will require of communication service providers means that companies would be well advised to keep abreast of how the law will impact their business practices.