Digital Economy Act 2017

The Digital Economy Act 2017 received Royal Assent on 27 April 2017. The Act includes provisions about electronic communications infrastructure, restricting access to online pornography, online copyright infringement, data sharing among government departments and public authorities, functions of Ofcom, regulation of the BBC, and regulation of direct marketing.

The Digital Economy Bill was introduced by the then Secretary of State for Culture, Media and Sport, John Whittingdale, who was later replaced by Karen Bradley.

ORG campaigned on three areas of the Act: age verification for online pornography, online copyright infringement offences, and data sharing in government.

Age verification

Core duty

As mentioned in the Queen's Speech on 21 June 2017, Part 3 of the DE Act is one of the main measures claimed to safeguard children's online safety.[1][2]

Section 14 (1) prohibits a person from making pornographic material available on the internet to persons in the UK on a commercial basis unless the material is secured so that it is not normally accessible by persons under 18.[3]

The objective of this clause is thus to prevent persons under 18 from accessing pornographic material online. For this purpose, an age-verification regulator is designated and given a mandate (see below).
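
To make the duty concrete, here is a minimal, purely illustrative sketch (in Python) of the kind of gate a commercial site would need: the material is only served once age verification succeeds. The helper name and token format are invented; the Act does not prescribe any particular age-verification technology.

```python
# Illustrative sketch of the Section 14(1) duty: pornographic material must
# not be "normally accessible" to under-18s, so serving it is gated on a
# successful age-verification check. All names here are invented; the Act
# does not prescribe any particular AV technology.

def verify_with_provider(av_token: str) -> bool:
    """Hypothetical call to a third-party age-verification provider."""
    # A real deployment would validate the token with an external AV service.
    return av_token == "verified-over-18"

def serve_pornographic_content(av_token: str | None) -> str:
    if av_token is not None and verify_with_provider(av_token):
        return "200 OK: content served"
    # Unverified visitors must be turned away for the site to comply.
    return "403 Forbidden: age verification required"

print(serve_pornographic_content(None))                # refused
print(serve_pornographic_content("verified-over-18"))  # served
```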

Scope of application

Section 14 (1)[4] does not appear to cover pornographic material provided on a non-commercial basis. What then happens to a person who makes pornographic material available on a social network service (SNS)?

YouTube

When uploading a video to YouTube, the uploader can choose whether to enable advertising on the video. If advertising is enabled, a commercial purpose could be established; if the video lacks advertising, it would be difficult to establish an intention to profit from the material.

Facebook, Twitter

When users upload content to these platforms, it could be difficult, except in a few distinct cases, to show a commercial purpose on the part of the platform itself.

A commercial purpose could be argued if the content links to another website that can only be accessed on payment, or is accompanied by explicit advertising that encourages the purchase of goods or services related to the uploaded content.

In such cases, although Facebook or Twitter do not publish the content themselves, they could be held liable for 'making available' the material.[5]

Section 14 (5) refers to the Electronic Commerce (EC Directive) Regulations 2002 and states that Regulations 17 to 20 and 22 apply to this Part (notwithstanding Regulation 3(2)[6], which would otherwise prevent the Regulations from applying to legislation passed after they came into force).

These Regulations protect services such as publishers of user-generated content, including P2P file-sharing platforms, from liability for breaches of the law caused by content uploaded by their users, as long as the services do not actively involve themselves in that conduct, have no actual knowledge of the illegality, or remove illegal content expeditiously once it comes to their attention (see, in particular, Regulations 17[7], 18[8] and 19[9]).

Thus, although the government argued that social media sites are not included, it is possible that the Act could be interpreted differently.

Definitions

Pornographic material

The meaning of 'pornographic material' (other than extreme pornographic material) is set out in Section 15 of the Act.[10] Nine different types of material are listed, but there are two key components.

The first, and the easiest to understand, criterion is that the material has an R18 certificate issued under the authority of the BBFC (Sections 4[11] and 7(2)[12] of the Video Recordings Act 1984; Section 15(2) of the DE Act[13]).

Second, in the absence of an R18 certificate, a video work can constitute pornographic material if it is reasonable to assume:

(i) that it was produced solely or principally for the purposes of sexual arousal, and
(ii) that its inclusion was among the reasons why the certificate issued for the work was an R18 certificate (Section 15(1) (e), (f), (g), (h) and (i))[14]

This criterion applies to a video work, or material included in a work, that the BBFC has determined not to be suitable for classification (Section 15 (1) (g), (h))[15], and to any other material which may never have been assessed by the BBFC (Section 15(1) (i))[16].

The second part appears to apply to extracts of classified videos, such as preview clips.
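
The two limbs can be paraphrased as a simple decision rule. The sketch below is an informal reading of Section 15, not a legally complete test; the boolean inputs are invented labels for the statutory questions a decision-maker would have to answer.

```python
# Informal paraphrase of the Section 15 two-limb test. This is not a
# legally complete test; the inputs are invented labels for the statutory
# questions, not terms taken from the Act.

def is_pornographic_material(has_r18_certificate: bool,
                             produced_for_sexual_arousal: bool,
                             would_attract_r18_or_refusal: bool) -> bool:
    # Limb 1: the work already holds an R18 certificate.
    if has_r18_certificate:
        return True
    # Limb 2: no certificate, but it is reasonable to assume the material
    # was produced solely or principally for sexual arousal AND it would be
    # classified R18 (or refused classification) if assessed by the BBFC.
    return produced_for_sexual_arousal and would_attract_r18_or_refusal
```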

Extreme pornographic material

Under Section 22 (1), 'extreme pornographic material' means material:

(a) whose nature is such that it is reasonable to assume that it was produced solely or principally for the purposes of sexual arousal (the same as the above (i)), and
(b) which is extreme.[17]

To be 'extreme' within the meaning of Section 22 (1) of the DEA, Section 63(7) or (7A) of the Criminal Justice and Immigration Act 2008 must apply. These provisions state that the content needs to portray:

(a) any of the following (provided that a reasonable person looking at the image would think that the persons or animals depicted were real) -
(1) an act which threatens a person's life,
(2) an act which results, or is likely to result, in serious injury to a person's anus, breasts or genitals,
(3) an act which involves sexual interference with a human corpse,
(4) an act of intercourse or oral sex with an animal (whether dead or alive) performed by a person,
(5) an act which involves the non-consensual penetration of a person's vagina, anus or mouth by another with the other person's penis, or
(6) an act which involves the non-consensual sexual penetration of a person's vagina or anus by another with a part of the other person's body or anything else[18]

These criteria do not apply to a video work which the BBFC has already classified (a 'classified video work'). Nevertheless, if material is extracted from the video work and the nature of that extracted part is

“such that it is reasonable to assume that it was extracted (with or without other material) solely or principally for the purposes of sexual arousal,”

the material in question could be recognised as “extreme” under Section 22 (3) and (4).[19]

Age-verification regulator

Relationship with SoS

The Secretary of State (SoS) may, through a notice, designate one person, or any two or more persons, as the age-verification regulator (Section 16(1)).[20] The SoS may at any time revoke an existing designation or make a new one (Section 16(2))[21], and where more than one person is designated as the regulator, each person can be responsible for a different function (Section 16(3)).[22] Every designation must be approved by each House of Parliament (Section 17 (5)).[23]

The regulator carries out the functions either provided for in Part 3 (Online Pornography) or specified in the designating notice (Section 16 (1) (a), (b)).[24] The regulator may be funded by the SoS in relation to carrying out its functions (Section 16(8)).[25] Moreover, in exercising its functions, the regulator must observe the guidance issued and revised by the SoS (Section 27 (1), (2) and (3)).[26]

Functions

1. Seeking information from ISPs

The age-verification regulator may, through a notice, require any information which it needs in order to exercise, or to decide whether to exercise, its functions under the relevant provisions (Section 18(1)).[27]

The notice can be issued to an ISP or any other person whom the regulator believes to be, or to have been, involved in making pornographic material available on the internet on a commercial basis to persons in the UK. Anyone falling within these criteria is a 'relevant person' (Section 18(3)).[28] In other words, this provision allows the regulator to compel information from ISPs and others when investigating who might be a 'relevant person'.


2. Addressing contraventions

Financial penalty

There are two cases in which the regulator may impose a financial penalty. First, where the regulator finds a contravention of Section 14 or non-compliance with a notice under Section 18[29], it may impose a financial penalty on the person responsible for the contravention or non-compliance (Section 19 (1)).[30] Second, where a person has failed to comply with an enforcement notice, the regulator may impose a financial penalty (Section 19(10)).[31]

No financial penalty may be imposed in any of the following cases:

(1) the contravention has ceased
(2) three years have passed since the beginning of the contravention
(3) one year has passed since the regulator became aware of the contravention (if sooner than (2)) (Section 19 (5), (6))[32]

These limits do not apply to non-compliance with Section 18.[33] In conjunction with Section 19 (12)[34], a relevant person may therefore have further financial penalties imposed on them repeatedly until they hand over the information sought by the regulator. The amount of the penalty is determined by the regulator (Section 20 (1)).[35] The upper limit is the greater of £250,000 or 5% of the person's qualifying turnover (Section 20 (2)).[36] No provision reduces these figures in line with the total amount of penalties already imposed or the number of times they have been imposed.
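
As a worked example of the Section 20 (2) cap, here is a short sketch; the turnover figures are invented, and 'qualifying turnover' has its own statutory definition which is not reproduced here.

```python
# Illustration of the Section 20(2) cap: the maximum penalty is the greater
# of £250,000 or 5% of the person's "qualifying turnover". The turnover
# figures below are invented for illustration.

def max_penalty(qualifying_turnover_gbp: float) -> float:
    return max(250_000.0, 0.05 * qualifying_turnover_gbp)

print(max_penalty(1_000_000))    # small operator: flat cap of 250000.0
print(max_penalty(100_000_000))  # large operator: 5% limb gives 5000000.0
```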


Enforcement notice

An enforcement notice may be issued in respect of a contravention of Section 14 (1)[37] (Section 19 (2)).[38] The notice tells the person in question that they are contravening Section 14 (1) and requires them to stop the contravention (Section 19 (7)).[39] The notice contains the reasons for the decision and a time limit (Section 19 (8))[40], with which the person must comply (Section 19 (9)).[41] A failure to comply with an enforcement notice could lead to a financial penalty (Section 19 (10))[42] and could be enforced by the regulator through civil proceedings for:

(a) an injunction
(b) specific performance of a statutory duty (Section 45 (2) of the Court of Session Act 1988)
(c) any other appropriate remedy or relief (Section 19 (11))[43]


Other

A financial penalty and an enforcement notice can be used independently of each other (Section 19 (4)).[44] Before the regulator makes a determination under Section 19 (1) or (2), it must give the person an opportunity to make representations (Section 19 (3)).[45]

3. Blocking contents

Where the regulator finds a non-complying person who is:

(a) contravening section 14(1), or
(b) making extreme pornographic material available on the internet to persons in the United Kingdom,

it may give a notice to any ISP (Section 23 (1)), which must contain the following information (pictured in the sketch after this list):

  • who the non-complying person is
  • whether it relates to Section 23 (1) (a) or (1) (b), or both
  • what steps need to be taken, or what arrangements need to be put in place, to prevent persons in the UK from accessing the offending material using the service the ISP provides
  • the arrangements for appeals (see Section 16 (6))[46]
  • further such particulars as considered appropriate by the regulator (Section 23(2))[47]
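
The required contents of such a notice can be pictured as a simple record. The sketch below is purely illustrative; the field names are invented, not taken from the Act.

```python
# Purely illustrative record of what a Section 23(2) notice must contain;
# field names are invented, not drawn from the Act itself.
from dataclasses import dataclass, field
from typing import List

@dataclass
class BlockingNotice:
    non_complying_person: str        # who the non-complying person is
    grounds: str                     # "23(1)(a)", "23(1)(b)", or both
    required_steps: List[str]        # steps/arrangements to prevent UK access
    appeal_arrangements: str         # see Section 16(6)
    further_particulars: List[str] = field(default_factory=list)
```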

An ISP given such a notice has a duty to comply with it, enforceable in the same way as an enforcement notice (Section 23 (9)). Prior to issuing a notice under Section 23 (1)[48], the regulator must

(1) inform the SoS of its decision to do so, and
(2) notify the non-complying person, explaining the reasons for its decision, how to resolve the issue at stake, and the arrangements for appeals (Section 23 (11)).[49]

The notice cannot be issued where doing so would be likely to be detrimental to national security or to the prevention or detection of serious crime, with 'serious crime' understood in the same way as in Section 263 (1) of the Investigatory Powers Act 2016[50] and by reference to the offences listed in Schedule 3 of the Sexual Offences Act 2003[51] (Section 24 (1), (2) and (3)).[52]

4. Other service providers

Not only ISPs but also 'payment-services providers' and 'ancillary service providers' can receive a notice under Section 21 when the regulator finds a non-complying person, in the same manner as specified by Section 23.[53] The notice must contain the following information:

  • who the non-complying person is
  • whether it relates to Section 21 (1) (a) or (1) (b), or both
  • further such particulars as considered appropriate by the regulator (Section 21(2))

When such a notice has been given to the providers, the non-complying person must also be notified (Section 21 (3)).[54]


A payment-services provider is a person who appears to the regulator to provide money-transfer services in connection with access to pornographic material made available by the non-complying person (Section 21(4)).[55] An ancillary service provider is a person who appears to the regulator to provide services which enable or facilitate the non-complying person's wrongdoing; it also includes anyone who advertises goods or services on the non-complying website (Section 21 (5) (a), (b)).[56] Providing a device or other equipment is excluded from the scope of this provision (Section 21 (6)).[57]

5. Issuing guidance

To clarify Section 14 (1)[58], the regulator may issue guidance. From the perspective of service providers (ISPs, payment-services providers and ancillary service providers), this means they will be better able to understand what 'making available on the internet' means and under what circumstances they are considered involved in it (Sections 14 (1)[59] and 21 (1), (5)[60]). The regulator may not publish this guidance on its own; it is subject to oversight by the SoS and both Houses of Parliament (see, generally, Section 25).[61]

Potential risks / challenges

Privacy

Creating an obligation for websites to verify the age of their users will also create a database of porn users and their viewing habits. Sufficient privacy safeguards were not introduced in the Act; they merely feature in the Codes of Practice as a vaguely worded requirement for age-verification tool providers to offer strong privacy protection for users.

The Act does not specify which age-verification methods websites may use. The Codes of Practice also state that they will not limit what technology can be used as an age-verification tool. This lack of specifics in the primary and secondary legislation makes it possible for tools with poor privacy standards to enter the AV-tool market.

Considering recent hacking attacks on pornographic websites[62][63] and the resulting leaks of personal information, operating an age-verification system with poor privacy standards could pose a similar risk to individuals' privacy.

MindGeek, owner of some of the biggest pornographic websites, previously announced that it is developing its own AV tool. Its prime position in the porn market gives it the advantage of already controlling a significant share of the market. It is therefore likely that its AV tool could become the most widespread, and could be made financially accessible to smaller pornographers in return for data on their customers' viewing preferences.

The DE Act makes it possible for MindGeek to develop a vast database of people's porn-viewing habits without strong obligations to protect their privacy.

Censorship

The Act places no limit on the power to censor non-complying websites. The Government previously stated that the power would be limited to the 50-100 most popular websites, and the BBFC and government attempted to reassure people privately that the number of blocks would be low and based on market share.

However, this is not reflected in the Act. How many websites end up blocked is purely a matter of the Government's policy preference and funding choices. The lack of specifics in the legislation makes it possible for the government to be lobbied at a later stage to block vast numbers of entirely legal sites.

Enforceability

As ORG has noted elsewhere, this age-verification system would not work effectively, because it is not likely to be ubiquitous: a majority of websites will not implement the scheme.

Even if the majority of relevant website owners implement age-verification systems, this is unlikely to reduce the amount of pornographic material available and accessible to minors: technical measures, such as Virtual Private Networks, can be used to circumvent the scheme. The main effect could instead be a decrease in the use of legal pornography.

Cost

As mentioned above, the regulator may impose a financial penalty on ISPs or any other person who has made pornographic material available on the internet to persons in the UK (this does not necessarily mean that the material is also available to minors). Such a pecuniary sanction could be so burdensome on small ISPs and other companies that they would either hand over whatever information the regulator seeks or give up their business.

Proportionality

Accordingly, establishing the age-verification scheme contemplated in the DE Act would not be an appropriate solution to minors' online safety. The main reason is that it is detrimental to privacy and freedom of speech, while doing little to make minors safer online.

Online criminal copyright sentences

Increased sentence

Part 4 of the DE Act, comprising only four clauses, concerns copyright protection. Section 32 (3) and (5)[64] increase the sentences for online copyright infringement by amending Sections 107[65] and 198[66] of the Copyright, Designs and Patents Act 1988. Specifically, for the online copyright infringement offences provided in these Sections, a liable person can now be sentenced to a maximum of 10 years (on conviction on indictment); before the amendment, the upper limit was 2 years. Not all offences under these Sections are subject to this amendment.


Offences for which people could be jailed for 10 years

Section 107 covers copyright infringement, including infringement by communication to the public. This Section criminalises the act of, without a licence from the copyright owner:

  • making for sale or hire
  • importing into the UK, other than for one's private and domestic use
  • distributing in the course of a business, or
  • distributing, other than in the course of a business, to such an extent as to prejudicially affect the copyright owner,

content that the person knows, or has reason to believe, infringes copyright.


Section 198 specifically protects the 'making available right'[67] of the authors, producers and performers of 'phonograms' (music, songs, and recordings of sound other than those incorporated in films or TV programmes).[68] This Section prohibits the act of, without the consent of the performer or the person having recording rights:

  • making for sale or hire, or
  • distributing in the course of a business

a recording that the person knows, or has reason to believe, is infringing.

How to be found guilty

Section 32 (2) and (4) of the DE Act[69] respectively amend Sections 107 and 198 of the CDPA. They set out what constitutes infringement for these purposes. In both cases, two conditions must be met for a person to be liable.

The first condition requires the person to know, or have reason to believe, that they are infringing copyright in the work.*

(*In the case of the making available right, the condition is not limited to 'the work'.)

The second condition is that the person has either:

1. an intention to make a gain for themselves or another person, or
2. knowledge that they are causing loss to the copyright owner, or exposing the copyright owner to a risk of loss (Section 32 (2) and (4) of the DE Act[70]; Sections 107 (2A) and 198 (1A) of the CDPA)

Both 'loss' and 'gain' here are limited to money, excluding any psychological dimension, and 'loss' covers both losing what one has and losing what one might otherwise have obtained (Section 32 (2) and (4) of the DE Act; Sections 107 (2B) and 198 (1B) of the CDPA). Hence, for example, a loss is caused simply by not paying a licence fee. (See this video of Jim Killock.)
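
Putting the two conditions together, the sketch below paraphrases the liability test informally; the inputs are invented labels for the statutory questions, and this is not a legally complete statement of the offences.

```python
# Informal paraphrase of the two-condition liability test inserted by
# Section 32 of the DE Act (CDPA ss. 107(2A), 198(1A)). The inputs are
# invented labels; this is not a legally complete test.

def is_criminally_liable(knows_or_believes_infringing: bool,
                         intends_gain: bool,
                         knows_of_loss_or_risk_of_loss: bool) -> bool:
    if not knows_or_believes_infringing:
        return False  # first condition: knowledge of infringement
    # Second condition: either limb suffices; both relate to money only.
    return intends_gain or knows_of_loss_or_risk_of_loss
```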


Potential risks / challenges

A grave concern with the copyright infringement revision lies in the lack of any threshold determining whether a person is guilty. The newly introduced provisions are silent on how much money must be involved before a person becomes liable for copyright infringement. Although prosecution or conviction for minor infringements may be unlikely, there is currently no guarantee of that. More worrying still, more people could more frequently be threatened with online copyright infringement claims and extorted for money through fear of imprisonment. This lack of a numerical threshold could encourage 'copyright trolls', who often target small-scale file sharers, demanding payment in threatening letters. An increased sentence without clarification on this point is likely only to empower these trolls.

Data Sharing

Internet filters

See also: Internet filters in the Digital Economy Act and EU Net Neutrality Regulation

Section 104: Internet filters

A Government amendment in the Lords[71] was added to the DEA to allow ISPs to use blocking or filtering if provided for in their own terms of service. It is now Section 104 of the DEA.[72]

104 Internet filters

(1) A provider of an internet access service to an end-user may prevent or restrict access on the service to information, content, applications or services, for child protection or other purposes, if the action is in accordance with the terms on which the end-user uses the service.

(2) This section does not affect whether a provider of an internet access service may prevent or restrict access to anything on the service in other circumstances.

(3) In this section—

“end-user” means an end-user of a public electronic communications service, within the meaning given by section 151(1) of the Communications Act 2003;

“internet access service” has the meaning given by Article 2(2) of Regulation (EU) 2015/2120 of the European Parliament and of the Council of 25th November 2015 laying down measures concerning open internet access and amending Directive 2002/22/EC on universal service and users’ rights relating to electronic communications networks and services and Regulation (EU) No 531/2012 on roaming on public mobile communications networks within the Union.

Interaction with Net Neutrality regulations

The amendment was the UK Government's response to their concerns that parental filters – introduced without legislation by UK ISPs – were made illegal by the EU net neutrality regulation.

As Matthew Hancock said for the Government in the Commons,

On Report in this House, we agreed that the parental control filters on internet connections are a very important tool in protecting children from harmful material online. I agreed to ensure that the Bill was amended in the Lords to tackle concerns that the EU net neutrality regulation would render these controls, which have worked well, illegal. Lords amendment 245 delivers on that promise.[73]

As pointed out by Malcolm Hutty at LINX,[74] the EU net neutrality regulation says (emphasis added):

Article 3 (1). End-users shall have the right to access and distribute information and content, use and provide applications and services, and use terminal equipment of their choice, irrespective of the end-user’s or provider’s location or the location, origin or destination of the information, content, application or service, via their internet access service.

Article 3 (2). Agreements between providers of internet access services and end-users on commercial and technical conditions and the characteristics of internet access services such as price, data volumes or speed, and any commercial practices conducted by providers of internet access services, shall not limit the exercise of the rights of end-users laid down in paragraph 1.[75]

As long as the UK remains in the EU and/or the EEA, EU law takes precedence over UK law. It seems fairly clear that EU law does not allow ISPs to write content-blocking measures into their terms of service that would reduce the rights of end-users to access information and content. The national regulator – Ofcom in the case of the UK – would need to assess the market position of any ISPs implementing the filters. Seeing as all of the large UK ISPs use the filters, it seems fairly clear that they have a large enough market share to be problematic with regard to the Regulation.

See also

External links

References

  1. Queen's Speech 2017
  2. Queen’s Speech 2017: background briefing notes
  3. Section 14 (1)
  4. Section 14 (1)
  5. Section 14 (5)
  6. Regulation 3 (2) of the EC Directive Regulations 2002
  7. Regulation 17 of the EC Directive Regulations 2002
  8. Regulation 18 of the EC Directive Regulations 2002
  9. Regulation 19 of the EC Directive Regulations 2002
  10. Section 15
  11. Section 4 of the Video Recordings Act 1984
  12. Section 7 (2) of the Video Recordings Act 1984
  13. Section 15 (2)
  14. Section 15 (1) (e), (f), (g), (h) and (i)
  15. Section 15 (1) (g), (h)
  16. Section 15 (1) (i)
  17. Section 22 (1)
  18. Section 63(7) or 7(A) of the Criminal Justice and Immigration Act 2008
  19. Section 22 (3), (4)
  20. Section 16 (1)
  21. Section 16 (2)
  22. Section 16 (3)
  23. Section 17 (5)
  24. Section 16 (1) (a), (b)
  25. Section 16 (8)
  26. Section 27 (1), (2) and (3)
  27. Section 18 (1)
  28. Section 18 (3)
  29. Section 18
  30. Section 19 (1)
  31. Section 19 (10)
  32. Section 19 (5), (6)
  33. Section 18
  34. Section 19 (12)
  35. Section 20 (1)
  36. Section 20 (2)
  37. Section 14 (1), http://www.legislation.gov.uk/ukpga/2017/30/section/14/enacted
  38. Section 19 (2)
  39. Section 19 (7)
  40. Section 19 (8)
  41. Section 19 (9)
  42. Section 19 (10)
  43. Section 19 (11)
  44. Section 19 (4)
  45. Section 19 (3)
  46. Section 16 (6)
  47. Section 23 (2)
  48. Section 23 (1)
  49. Section 23 (11)
  50. Section 263 (1) of the Investigatory Powers Act 2016
  51. Schedule 3 of the Sexual Offences Act 2003
  52. Section 24 (1), (2) and (3)
  53. Section 23
  54. Section 21 (3)
  55. Section 21 (4)
  56. Section 21 (5) (a), (b)
  57. Section 21 (6)
  58. Section 14 (1)
  59. Section 14 (1)
  60. Section 21 (1), (5)
  61. Section 25
  62. FriendFinder hacking attack
  63. xHamster hacking attack, https://www.theinquirer.net/inquirer/news/2478759/xhamster-hack-380-000-accounts-exposed-in-porn-site-breach
  64. Section 32 (3) and (5)
  65. Section 107 of the CDPA
  66. Section 198 of the CDPA
  67. Article 14 of WIPO Performances and Phonograms Treaty (WPPT)
  68. Article 2 of WPPT
  69. Section 32 (2) and (4)
  70. Section 32 (2) and (4)
  71. Government amendments
  72. 104 Internet filters, Digital Economy Act 2017, legislation.gov.uk
  73. Digital Economy Bill Hansard, 26 April 2017
  74. Malcolm Hutty, LINX
  75. Regulation (EU) 2015/2120, EUR-Lex, 25 November 2015