
Video-sharing Platforms: BBFC Ratings

Volume 810: debated on Thursday 4 March 2021

Question

Asked by

To ask Her Majesty’s Government what plans they have to mandate the use of British Board of Film Classification ratings for user-generated content on video-sharing platforms.

My Lords, the British Board of Film Classification’s age ratings are currently used by a number of video-on-demand providers. Although adoption is voluntary, we welcome their use. The video-sharing platform regime, for which Ofcom is the regulator, came into force on 1 November 2020. UK-established video-sharing platforms must now take appropriate measures to protect the public, including minors, from illegal and harmful material. Video-sharing platforms may adopt age ratings as an appropriate measure; however, they are not obliged to do so.

I thank my noble friend the Minister for that reply, but there is a wider issue with BBFC certification. The recently launched Disney streaming service ran a documentary originally certificated by the BBFC as suitable for those aged 18 and over. Disney chose to self-certificate it as suitable for 12 and over. Believe me, some scenes in that documentary were truly horrific. To protect children, will the Government, as a matter of urgency, bang heads together and get every streaming service to sign up to the BBFC system, which is tried and trusted?

I agree with my noble friend’s last remark about this system being trusted. The Government have great trust in the BBFC’s best-practice age ratings. On his suggestion that we bang heads together, we aim to approach things more gently, but we are actively engaging with the industry to encourage other platforms to adopt the BBFC’s ratings across all their content, and will keep the evidence for legislation in this area under review.

My Lords, I declare a past interest as a member of the first British video classification council, chaired by Lord Harewood. It was difficult then, so I ask the Minister how parents can be expected to manage their children’s screen time today, when there is such a lack of regulation and a slow government response.

My noble friend makes a valid point, and I know that parents have had extraordinary challenges in this area, particularly over the last year. She is aware that we are developing a media literacy strategy and that, last year, we published guidance on online safety for children. We should also remember that our broadcasters have educated, entertained and informed our children in the last year.

The Government’s response to the online harms White Paper says that:

“The regulator will be required to have regard to the fact that children have different needs at different ages when preparing codes of practice relevant to the protection of children.”

What powers will Ofcom have to provide sufficient oversight and ensure enforcement of these additional protections? Will they be set out in the online safety Bill?

I assure the noble Viscount that they will be set out in the legislation. Ofcom will have wide-ranging powers to tackle both illegal and harmful content. I am happy to write to him with more detail.

In December, the Minister spoke of the voluntary nature of the BBFC scheme for video-on-demand services, of which she reminded us earlier. One of the strengths of the BBFC’s ratings is that they are well understood by parents and children alike. The same cannot be said for the inconsistent approaches adopted by platforms offering user-generated content. How do the Government plan to balance the undeniable need for change, to which noble Lords have referred, with their wish to minimise regulation, which is clearly not working at the moment?

The noble Lord will be aware that the adoption of BBFC ratings, particularly by Netflix, is a relatively recent development, so we have not yet made an assessment of its impact, either on the accessibility of content or on other streaming services. As I said to my noble friend Lord Grade, we are keeping this under review.

My Lords, YouGov research confirms that 82% of parents and 73% of children want BBFC age ratings displayed on user-generated content on these video-sharing platforms. Given new duties under the revised audio-visual media services directive to protect children, and with the promised duty of care, is not actual regulation from the Government needed to make sure that these platforms work with the trusted ratings from the BBFC to better protect children? Are not the Government running against the tide?

We do not believe that we are running against the tide. The online harms legislation, which we have discussed extensively in this House and which I know we will debate in great detail in future, will make us a world leader in this regard.

My Lords, sensibly regulating the wild west of user-generated content on the internet is essential, but potentially a whack-a-mole exercise, given the risk that it simply displaces activity elsewhere. How will the DCMS work with Ofcom to ensure that its implementation of the video-sharing platform regime develops understanding of how to regulate online services, in advance of the online safety Bill coming into force?

My noble friend makes an important point. Through the implementation of the video-sharing platform regime, as he suggests, Ofcom will build its experience in regulating harmful content while balancing freedom of expression. I understand that Ofcom is already preparing for its new responsibilities in relation to online harms by bringing in new technology and people with the right skills.

My Lords, I declare an interest in that for 10 years I was a vice-president of the BBFC. While the adoption of the BBFC’s age ratings is currently voluntary, does the Minister welcome the fact that Netflix announced on 1 December last year that it had become the first platform to achieve complete coverage of its content under the BBFC’s ratings, and that a number of other video-on-demand platforms use BBFC ratings for some of their content, including Amazon Prime Video, Apple TV+, Curzon Home Cinema and BFI Player? Will she continue to engage with the industry to encourage other platforms to adopt the BBFC’s ratings across all their content?

Absolutely. The Government welcome Netflix’s decision and, as I mentioned earlier, we continue to work with a number of the providers in this area.

I refer the House to my interests on the register. Age rating is just one of the many tools needed to build the digital world that children deserve, but it is hugely important to children and families that are looking to curate an age-appropriate experience. Is the Minister aware that Apple and Google app stores routinely advertise apps and games as suitable for four-plus and nine-plus for services whose own terms and conditions state that they are only for 16-plus or adult use? This means that a child or parent will download an app on the false understanding that it is age appropriate. Does she agree that there is little point age-rating individual pieces of content if the largest companies in the world continue to mislabel products and services on an industrial scale?

I would be happy to discuss the matter that the noble Baroness raises with the relevant platforms and the Video Standards Council. We encourage online store fronts to follow the BBFC best practice for labelling online apps, which includes signing up to the International Age Rating Coalition system.

My Lords, I declare my interest as vice-chair of the All-Party Parliamentary Group on Esports. Does the Minister agree that, in protecting children’s rights, the views of gamers, children and teachers should be taken into account when considering a combination of age labelling, filters and parental controls, and that tools such as URI, which provide age ratings for UGC available via online video-sharing platform services, are exceptionally helpful in this context?

My noble friend is right that the views of children, gamers and teachers are important. Under the video-sharing platform regime, UK-established platforms will be required to take appropriate measures to protect all their users from illegal content and minors from harmful content. Those measures could include a combination of age labelling, filters, parental controls and technical tools.