Digital Charter

The Digital Charter is a policy paper introduced by the Conservative Government in 2017 and first published in January 2018. The full policy paper can be viewed on GOV.UK, and this wiki page outlines some of the relevant sections of the document.

Early Proposals

Details of the early proposal stages of the Charter can be found on the Digital Charter/Proposals page.

Stated Aims of the Charter

The internet can be used to spread terrorist material; it can be a tool for abuse and bullying; and it can be used to undermine civil discourse, objective news and intellectual property. Citizens rightly want to know that they will be safe and secure online. Tackling these challenges in an effective and responsible way is critical for digital technology to thrive.

The Digital Charter is our response: a rolling programme of work to agree norms and rules for the online world and put them into practice. In some cases this will be through shifting expectations of behaviour; in some we will need to agree new standards; and in others we may need to update our laws and regulations.[1]

Guiding Principles

The Charter states that the Government will be guided by the following principles in pursuing its aims:[1]

  • the internet should be free, open and accessible
  • people should understand the rules that apply to them when they are online
  • personal data should be respected and used appropriately
  • protections should be in place to help keep people safe online, especially children
  • the same rights that people have offline must be protected online
  • the social and economic benefits brought by new technologies should be fairly shared

Work Programme

The Charter sets out a "Work Programme" identifying the areas of current and emerging technology that it is primarily interested in.[1]

They are copied below:

  • Digital economy – building a thriving ecosystem where technology companies can start and grow.
  • Online harms – protecting people from harmful content and behaviour, including building understanding and resilience, and working with industry to encourage the development of technological solutions.
    • The paper references the Internet Safety Strategy as an example of the Government already having made progress in this area.
  • Liability – looking at the legal liability that online platforms have for the content shared on their sites, including considering how we could get more effective action through better use of the existing legal frameworks and definitions.
  • Data and artificial intelligence (AI) ethics and innovation – ensuring data is used in a safe and ethical way, and when decisions are made based on data, these are fair and appropriately transparent.
    • The paper references the opening of the new Centre for Data Ethics and Innovation as an example of the Government already having made progress in this area.
  • Digital markets – ensuring digital markets are working well, including through supporting data portability and the better use, control and sharing of data.
    • The paper references the Data Protection Bill 2017 as an example of the Government already having made progress in this area.
  • Disinformation – limiting the spread and impact of disinformation intended to mislead for political, personal and/or financial gain.
  • Cyber security – supporting businesses and other organisations to take the steps necessary to keep themselves and individuals safe from malicious cyber activity, including by reducing the burden of responsibility on end-users.

Collaboration with Industry

The Charter will not be developed by government alone. We will look to the tech sector, businesses and civil society to own these challenges with us, using our convening power to bring them together with other interested parties to find solutions.[1]

Green Paper and White Paper

The government's Green Paper consultation on the Internet Safety Strategy was launched in October 2017.

The government published its response to the Green Paper consultation in May 2018. This included a Draft code of practice for providers of online social media platforms and a Draft transparency reporting template.

The response signalled a change in policy, towards a focus on vulnerable groups and on a new legislative framework to govern social media companies' approach to harmful content.

The response set out specific goals, including changes to liability for content online:

Online platforms need to take responsibility for the content they host. They need to proactively tackle harmful behaviours and content. Progress has been made in removing illegal content, particularly terrorist material, but more needs to be done to reduce the amount of damaging content online, legal and illegal.

We are developing options for increasing the liability online platforms have for illegal content on their services. This includes examining how we can make existing frameworks and definitions work better, as well as what the liability regime should look like in the long-run.[2]

UK Council for Internet Safety

DCMS has announced the UK Council for Internet Safety, a replacement for the UK Council for Child Internet Safety which will be broader in scope.

Priority areas of focus will include online harms such as cyberbullying and sexual exploitation; radicalisation and extremism; violence against women and girls; hate crime and hate speech; and forms of discrimination against groups protected under the Equality Act, for example on the basis of disability or race.[3]

An Online Harms white paper is expected in 2019.

Other contributions

Ofcom report

DCMS Committee

References

  1. Digital Charter, gov.uk, 25 January 2018
  2. Internet Safety Strategy green paper, DCMS, October 2017; government response, May 2018
  3. New Council for Internet Safety in the UK, DCMS, 6 July 2018