AVMS Directive consultation

The 2018 Audiovisual Media Services Directive has to be transposed into UK law, assuming that the UK remains within the EU. The government (DCMS) is now consulting on the requirements for video-sharing platforms (VSPs). The inclusion of user-generated content of this kind is the most significant change of scope from the previous Directive, which targeted video-on-demand services only.

Currently, "On-demand programme service providers" (ODPSs) are regulated in the UK by Ofcom under the 2010 Directive.

The 2018 Directive amends the 2010 Directive, so the numbering within the new Directive shows where it amends or replaces text in the prior Directive.

Who and what is regulated

The definition targets services that are designed to help people share video, but the Directive only regulates the video content within them:

where the principal purpose of the service or of a dissociable section thereof or an essential functionality of the service is devoted to providing programmes, user-generated videos or both to the general public, for which the video-sharing platform provider does not have editorial responsibility, in order to inform, entertain or educate, by means of electronic communications networks... and the organisation of which is determined by the video-sharing platform provider, including by automatic means or algorithms in particular by displaying, tagging and sequencing[1]

Country of origin principle

Article 28a sets out the country of origin principle, meaning that VSPs should be regulated in the EU country where they are established.[2]

The government says that VSPs that are not normally regulated by the UK's authority (Ofcom) may nevertheless be regulated within the UK for additional requirements, such as the Digital Economy Act 2017's age verification scheme, through a derogation. The terms of such a derogation are set out in Article 3.[3]

The derogation regime, however, relates to individual sites disregarding their duties. On a simple reading, it appears that the AVMS Directive would need to require that age verification for adult content be in place, and that the video site(s) in question were failing to comply with those measures.

What is not regulated

The UK government says that:

Recital 6 of the revised directive also sets out what is not intended to be covered, for example video clips embedded in the editorial content of electronic versions of newspapers and magazines and animated images such as GIFs. It also sets out that the definition of VSPs does not cover non-economic activities, such as the provision of audiovisual content on private websites and non-commercial communities of interest.[4]

Changes needed

Regulator or self-regulation

The Directive envisages that in many member states self-regulation will deliver many of the requirements. It therefore proposes that national regulators act as a backstop, able to recognise or dismiss self-regulatory bodies that deliver the requirements of the Directive.

In the UK's case, because the government expects the regulation to be superseded, it expects Ofcom to take on the work without seeking self-regulatory bodies, with the exception of the advertising requirements, which it recommends be dealt with by the Advertising Standards Authority.[5]

Age verification of users and content restrictions

Material that is pornographic, that incites violence or hatred, or that promotes terrorism is subject to restriction under the Directive. Pornography must be restricted so that minors cannot view it. Other material must be subject to takedown ("appropriate measures to protect the general public"):

Member States shall ensure that video-sharing platform providers under their jurisdiction take appropriate measures to protect:

(a) minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development in accordance with Article 6a(1);
(b) the general public from programmes, user-generated videos and audiovisual commercial communications containing incitement to violence or hatred directed against a group of persons or a member of a group based on any of the grounds referred to in Article 21 of the Charter;
(c) the general public from programmes, user-generated videos and audiovisual commercial communications containing content the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence as set out in Article 5 of Directive (EU) 2017/541, offences concerning child pornography as set out in Article 5(4) of Directive 2011/93/EU of the European Parliament and of the Council and offences concerning racism and xenophobia as set out in Article 1 of Framework Decision 2008/913/JHA.[6]

Age verification

It is not clear that age verification of users is required by the Directive; it is certainly not required in all cases. Ofcom currently requires "specially restricted material" on ODPSs to be accessible only after age verification.

Ofcom intends to amend its regulation of video services to align its definitions of pornographic content with the DEA and BBFC guidance.[7]

Other material would not be subject to age verification.

Other appropriate measures

The Directive provides a list of appropriate measures for content restriction. These:

… measures shall be practicable and proportionate, taking into account the size of the video-sharing platform service and the nature of the service that is provided. Those measures shall not lead to any ex-ante control measures or upload-filtering of content which do not comply with Article 15 of Directive 2000/31/EC. For the purposes of the protection of minors, provided for in point (a) of paragraph 1 of this Article, the most harmful content shall be subject to the strictest access control measures. Those measures shall consist of, as appropriate:

(a) including and applying in the terms and conditions of the video-sharing platform services the requirements referred to in paragraph 1;
(b) including and applying in the terms and conditions of the video-sharing platform services the requirements set out in Article 9(1) for audiovisual commercial communications that are not marketed, sold or arranged by the video-sharing platform providers;
(c) having a functionality for users who upload user-generated videos to declare whether such videos contain audiovisual commercial communications as far as they know or can be reasonably expected to know;
(d) establishing and operating transparent and user-friendly mechanisms for users of a video-sharing platform to report or flag to the video-sharing platform provider concerned the content referred to in paragraph 1 provided on its platform;
(e) establishing and operating systems through which video-sharing platform providers explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to in point (d);
(f) establishing and operating age verification systems for users of video-sharing platforms with respect to content which may impair the physical, mental or moral development of minors;
(g) establishing and operating easy-to-use systems allowing users of video-sharing platforms to rate the content referred to in paragraph 1;
(h) providing for parental control systems that are under the control of the end-user with respect to content which may impair the physical, mental or moral development of minors;
(i) establishing and operating transparent, easy-to-use and effective procedures for the handling and resolution of users' complaints to the video-sharing platform provider in relation to the implementation of the measures referred to in points (d) to (h);
(j) providing for effective media literacy measures and tools and raising users' awareness of those measures and tools.[8]

Redress and complaints

Article 28b(7) outlines the need to provide dispute resolution for users, whether material is removed or left in place.

Member States shall ensure that out-of-court redress mechanisms are available for the settlement of disputes between users and video-sharing platform providers … Such mechanisms shall enable disputes to be settled impartially and shall not deprive the user of the legal protection afforded by national law.[9]

The government says that:

"Given the sheer volume of audio-visual content shared on video sharing platforms online, it would not be practicable to establish a similar complaints regime as exists for linear broadcasting … The regulator, therefore, would oversee the requirements for VSPs to have an effective and easy to access complaints function, and to have an external independent appeals process.[10]

Notification

As the regulator needs to know who it is regulating, the Directive provides for different ways for services to make themselves known to the regulator. The government says that it prefers a 'notification regime', where a service will notify Ofcom that it exists and needs to be regulated.[11]

Interaction with the Online Harms White Paper

The government believes that its own measures will cover video-sharing sites and, as those requirements go beyond the AVMS Directive's minimums, will supersede it. It therefore regards these changes as temporary.

References

  1. AVMS Directive 2018, Article 1, paragraph 1(aa)
  2. AVMS Directive 2018, Article 28a, paragraph 1
  3. AVMS Directive 2018, Article 3
  4. VSP Consultation, paragraph 5
  5. VSP Consultation, paragraph 13
  6. AVMS Directive 2018, Article 28b, paragraph 1
  7. Ofcom, 'Specially restricted material' and Age Verification Guidance for Providers of On-Demand Programme Services
  8. AVMS Directive 2018, Article 28b, paragraph 3
  9. AVMS Directive 2018, Article 28b, paragraph 7
  10. VSP Consultation, paragraphs 29-30
  11. VSP Consultation, paragraphs 23-25