Disinformation and 'fake news': Final Report
Digital, Culture, Media and Sport Committee | 14 February 2019
This is the Final Report from the DCMS Committee's inquiry into disinformation, which has spanned over 18 months, covering individuals' rights over their privacy, how their political choices might be affected and influenced by online information, and interference in political elections both in this country and across the world.
Recommendations
- A new category of tech company should be formulated, one that tightens tech companies' liabilities and is not necessarily either a 'platform' or a 'publisher'. This approach would see the tech companies assume liability for content identified as harmful after it has been posted by users.[1]
- Clear legal liabilities should be established for tech companies to act against harmful or illegal content on their sites:[2]
  - This should be achieved through a compulsory Code of Ethics overseen by an independent regulator.
  - The regulator should have statutory powers to monitor relevant tech companies.
  - The Code of Ethics should be similar to the Broadcasting Code issued by Ofcom.
  - The Code of Ethics should be developed by technical experts and overseen by the independent regulator.
  - The independent regulator should have statutory powers to obtain any information from social media companies relevant to its inquiries, including:
    - The capability to check what data is being held on an individual user, if a user requests such information.
    - Access to tech companies' security mechanisms and algorithms, to ensure they are operating responsibly.
  - The regulator should be accessible to the public and able to take up complaints from members of the public about social media companies.
- Support the position that inferred data should be protected under law in the same way as personal information.[3]
  - This should be done by extending the protections of privacy law beyond personal information to include the models used to make inferences about an individual (a minimal illustrative sketch follows this list).
- A levy should be placed on tech companies operating in the UK to support the enhanced work of the Information Commissioner's Office.[4]
  - The new independent system of regulation recommended here should likewise be adequately funded by a levy on tech companies operating in the UK.
- There should be full disclosure of the targeting used as part of advertising transparency. The Government should explore ways of regulating the use of external targeting on social media platforms.[5]
- Agree with the ICO's proposal that a Code of Practice, which highlights the use of personal information in political campaigning and applies to all data controllers who process personal data for the purpose of political campaigning, should be underpinned by primary legislation.[6]
- The Government should conduct analysis to understand the extent of the targeting of voters by foreign players during past elections. Legislation should keep pace with the latest technological developments, and should be explicit about the illegal influencing of the democratic process by foreign players.[7]
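To make the distinction behind recommendation [3] concrete, the sketch below shows how "inferred data" can be produced from ordinary behavioural records without the individual ever stating it directly. This is a minimal, hypothetical illustration: the page names, weights, and scoring logic are invented for the example and do not come from the report.

```python
# Hypothetical sketch: deriving "inferred data" (a political leaning) from
# behavioural data (page likes). All names and weights are invented.

# A toy model: each page "like" contributes weighted evidence towards a leaning.
LIKE_SIGNALS = {
    "page:free_market_forum": ("right", 0.8),
    "page:union_solidarity": ("left", 0.7),
    "page:fox_hunting_debate": ("right", 0.6),
    "page:climate_action_now": ("left", 0.5),
}

def infer_leaning(likes):
    """Return an inferred political leaning and a crude confidence score."""
    scores = {"left": 0.0, "right": 0.0}
    for like in likes:
        if like in LIKE_SIGNALS:
            leaning, weight = LIKE_SIGNALS[like]
            scores[leaning] += weight
    total = scores["left"] + scores["right"]
    if total == 0:
        return "unknown", 0.0
    leaning = max(scores, key=scores.get)
    return leaning, scores[leaning] / total

# The user never supplied a political view, yet one is now held about them:
leaning, confidence = infer_leaning(
    ["page:union_solidarity", "page:climate_action_now"]
)
print(f"Inferred leaning: {leaning} (confidence {confidence:.0%})")
```

The point of the recommendation is that the output of such a model, and the model itself, can reveal as much about a person as the raw data it was built from, and so should attract the same legal protection.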
Quotes
Elizabeth Denham, Information Commissioner, Evidence to the Committee, November 2018 [8]
We don't want to use the same model that sells us holidays and shoes and cars to engage with people and voters. People expect more than that. This is a time for a pause to look at codes, to look at the practices of social media companies, to take action where they have broken the law. For us, the main purpose of this is to pull back the curtain and show the public what is happening with their personal data.
Ashkan Soltani, independent researcher and consultant, former Chief Technologist at the Federal Trade Commission, Evidence to the Committee, November 2018 [9]
In short, I found that time and time again Facebook allows developers to access personal information of users and their friends, in contrast to their privacy settings and their policy statements. This architecture means that if a bad actor gets a hold of these tokens [...] there is very little the user can do to prevent their information from being accessed. Facebook prioritises these developers over their users.
Digital, Culture, Media and Sport Committee, Disinformation and 'fake news': Final Report[10]
Participating in social media should allow more pause for thought. More obstacles or 'friction' should be both incorporated into social media platforms and into users' own activities - to give people time to consider what they are writing and sharing. Techniques for slowing down interaction should be taught, so that people themselves question both what they write and what they read - and that they pause and think further, before they make a judgement online.
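As a purely illustrative sketch of the "friction" the report describes, the following shows one way a platform might enforce a pause and a re-confirmation before a post is shared. The report does not prescribe any implementation; the function names and the ten-second delay are assumptions made for the example.

```python
import time

def publish(message: str) -> None:
    """Stand-in for a platform's real posting pipeline (hypothetical)."""
    print(f"Posted: {message!r}")

def share_with_friction(message: str, delay_seconds: int = 10) -> None:
    """Impose a short, deliberate pause and a re-confirmation before posting."""
    print(f"You are about to share: {message!r}")
    print(f"Take {delay_seconds} seconds to re-read it before confirming.")
    time.sleep(delay_seconds)  # the "obstacle" that slows the interaction down
    answer = input("Still want to post this? [y/N] ")
    if answer.strip().lower() == "y":
        publish(message)
    else:
        print("Post discarded.")

if __name__ == "__main__":
    share_with_friction("Unverified claim I saw on another site")
```

The design choice here is simply that the cheapest action (posting immediately) is made slightly more expensive, giving the user time to question what they are about to share.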