Communications Minister Richard Bruton today announced that he will introduce a new Online Safety Act to improve online safety and ensure that children can be protected online.
In a speech to students, parents and teachers at St. Brigid's Girls National School in Glasnevin, Minister Bruton said:
Digital technology is transforming the world in which we work and live and learn. This provides huge opportunities for us all. It has been central to our economic and social development as a country for three decades. However, the digital world also presents new risks which did not exist previously.
As Minister for Education and Skills, I recognised the need to take action in this area, and issued a direction to schools that they should consult with parents, teachers and students on the use, if any, of smartphones in schools. The situation at present, where online and social media companies are not subject to any oversight or regulation by the state for the content which is shared on their platforms, is no longer sustainable. I believe that the era of self-regulation in this area is over and a new Online Safety Act is necessary.
Many parents find it difficult to keep up with the latest technology, or the latest app. That is understandable given how quickly online games and technology can evolve. To me it emphasises why the establishment of an Online Safety Commissioner is so important.
While it would be impossible to remove every danger from the internet or from the adoption of new technology, what we need to do is to ensure that parents and children are better equipped, that the state can provide regulation and enforcement, and that online platforms take responsibility.
I will bring forward an Online Safety Act which sets out how we can ensure children are safe online. This will involve, for the first time, setting a clear expectation for service providers to take reasonable steps to ensure the safety of the users of their service. A Regulator, an Online Safety Commissioner, would oversee the new system.
Today I am putting forward a number of options for how a new Online Safety Commissioner can do this important work.
Under an Online Safety Act, changes would be introduced to:
- Protect Irish residents using online platforms with appropriate provisions
- Apply European law, which is designed to apply EU-wide to a narrower range of services (e.g. video sharing platform services, on-demand audiovisual media services, traditional TV)
New online safety laws to apply to Irish residents
Today the Minister proposed the categories of harmful online content that need to be targeted under his plan: “many of the proposals to date have not defined harmful content. The danger of not providing a clear definition is that we would unintentionally restrict legitimate freedom of speech and freedom of expression, which are core values. I believe the following areas are clear examples of what can be considered harmful:
- Serious cyberbullying, including content which is seriously threatening, seriously intimidating, seriously harassing or seriously humiliating
- Material which promotes self-harm or suicide
- Material designed to encourage prolonged nutritional deprivation that would have the effect of exposing a person to risk of death or endangering health”
Online platforms are already required to remove content which it is a criminal offence under Irish and EU law to disseminate, such as material containing incitement to violence or hatred, content containing public provocation to commit a terrorist offence, offences concerning child sexual abuse material or concerning racism and xenophobia. The Minister said: “while the Gardaí will continue to be responsible for investigating and prosecuting these offences, it is important that platforms take preventative steps to protect victims of such offences.”
The Minister said that an Online Safety Act would place new requirements on operators to:
- Operate an Online Safety Code, which would set out the steps they are taking to keep their users safe online
- Include in their code a number of issues at a minimum (e.g. a prohibition on cyberbullying material; a complaints procedure, with timelines, through which people can request that material be taken down)
- Build safety into the design of online platforms through the application of technology and human intervention
The Minister said that in passing an Online Safety Act it was important to set out clearly the powers and role of the Online Safety Commissioner. It is proposed that a number of powers could be provided to the Commissioner, including to:
- Certify each Online Safety Code as “fit for purpose” or require changes to it
- Require regular reports from industry on a range of issues, including content moderation, review and adjudication of appeals
- Review the measures which a service has in place, or review a company’s content moderation teams as they operate
- Require a service to remove an individual piece of content within a set timeframe, following an adjudication by the Online Safety Commissioner of an appeal from a user who is dissatisfied with the response they have received to a complaint submitted to the service provider
- Issue interim and final notices to services in relation to failures of compliance, and seek Court injunctions to enforce the notices of the regulator
- Impose administrative fines in relation to failures of compliance
- Publish the fact that a service has failed to comply or cooperate with the regulator
- Seek that criminal proceedings be brought against a service provider, where an offence is created of not cooperating with the regulator (e.g. by failing to put measures in place, or by failing to provide information to the regulator)
The Minister said:
It is important that government works with industry and the community to ensure the online safety of children is paramount. It is very encouraging that in Australia, there is a 100% compliance rate with the Australian e-Safety Commissioner.
While there are many very good initiatives underway across government to promote online safety, particularly by WebWise, the Online Safety Commissioner can be a single access point through which all available online safety resources can be reached by parents, teachers and children. This could build on the government’s Be Safe Online portal.
The Best Way to Regulate
The Minister also set out two ways in which an Online Safety Commissioner could be established:
- A Media Commission: establish a new Media Commission by restructuring the Broadcasting Authority of Ireland, along the lines of the multi-Commissioner Competition and Consumer Protection Commission. Establish the Online Safety Commissioner as a powerful office within that structure.
- Two Regulators: two regulatory bodies, one of which would involve restructuring the BAI and assigning it responsibility for content which is subject to editorial control (traditional television and radio broadcasting and on-demand audiovisual media services). The second, an online safety regulator, would be a new body responsible for online content that is not subject to editorial controls (such as social media and video sharing platforms).
Regulation of Video Sharing, On Demand and Traditional TV
The second role of the Online Safety Commissioner would be to apply European law to video sharing. The new law requires significant changes to the way in which Ireland regulates audiovisual content, both offline and online, including:
- Ensuring that Video Sharing Platforms have sufficient measures (e.g. parental controls and age-verification) in place.
- Ensuring that Video Sharing Platforms have a complaints mechanism in place where a user can make a complaint regarding content which is hosted on the service.
The Online Safety Commissioner will certify that the measures which a service has in place are sufficient, and review the efficacy of the measures on an on-going basis. This could involve, for example, conducting audits of the measures which the services have in place or a more direct review of a company’s content moderation teams as they are operating.
Under EU Law, the Online Safety Commissioner would be required to regulate all video sharing platforms that are based in Ireland.
The AVMS Directive also requires a number of other changes to the regulation of traditional TV and on-demand services (e.g. RTÉ Player, Virgin Media Player, iTunes), including aligning the rules and requirements for traditional TV and on-demand audiovisual media services and requiring a 30% quota of European works on on-demand audiovisual media services.
The revised Directive also allows an EU country to levy the revenues that a traditional TV service or an on-demand audiovisual media service earns in that country, even if the service is based in another EU country.
Short Consultation Stage
The Minister also announced that he would today commence a short, six-week consultation period on the options set out in his speech.
The Minister said:
In announcing that I wanted to reform the role played by religion in the school admissions process, I believe that the process of a short consultation, followed by finalising our approach and developing the necessary legal changes, worked very well in enacting change. I hope to use a similar process here.
I urge all parents and students, all teachers, and all industry and groups who have views on these issues or concerns about possible impacts to make their views known, so that we can take them into account as we develop legal proposals which are implementable.
The Minister concluded by saying “following the consultation period I will bring a draft heads of bill to government setting out a detailed plan for how we will make progress.”
The consultation will go live on the Department’s website later today and will be available here.