Committee on Standards in Public Life/Intimidation in Public Life


The Committee's 17th report looks at intimidation in public life:

In recent years, the intimidation experienced by Parliamentary candidates, and others in public life, has become a threat to the diversity, integrity, and vibrancy of representative democracy in the UK.[1]

Its findings have been interpreted as lending support to the Digital Charter.

A significant proportion of candidates at the 2017 general election experienced harassment, abuse and intimidation. There has been persistent, vile and shocking abuse, threatened violence including sexual violence, and damage to property. It is clear that much of this behaviour is targeted at certain groups. The widespread use of social media platforms is the most significant factor driving the behaviour we are seeing.[2]

We propose legislative changes that the government should bring forward on social media companies’ liability for illegal content online, and an electoral offence of intimidating Parliamentary candidates and party campaigners. Political parties must also put in place measures for more effective joint working to combat intimidation in advance of the next general election. In the long term, prevention will be more effective and important than any individual sanction. Those in public life must adopt a more healthy public discourse and must stand together to oppose behaviour which threatens the integrity of public life.[3]

The report endorses an attack on the already weak protections for hosts in the E-Commerce Directive:

In the fast-paced and rapidly developing world of social media, the companies themselves and government must both proactively address the issue of intimidation online. Not enough has been done. The Committee is deeply concerned about the limited engagement of the social media companies in tackling these issues.

Currently, social media companies do not have liability for the content on their sites, even where that content is illegal. This is largely due to the EU E-Commerce Directive (2000), which treats the social media companies as ‘hosts’ of online content. It is clear, however, that this legislation is out of date. Facebook, Twitter and Google are not simply platforms for the content that others post; they play a role in shaping what users see. We understand that they do not consider themselves as publishers, responsible for reviewing and editing everything that others post on their sites. But with developments in technology, the time has come for the companies to take more responsibility for illegal material that appears on their platforms.[4]

Relevant recommendations

Recommendation [5] | Responsibility | Timeframe
Government should bring forward legislation to shift the liability of illegal content online towards social media companies. | Government | On exiting the EU
Social media companies must develop and implement automated techniques to identify intimidatory content posted on their platforms. They should use this technology to ensure intimidatory content is taken down as soon as possible. | Social media companies | Immediately
Social media companies must do more to prevent users being inundated with hostile messages on their platforms, and to support users who become victims of this behaviour. | Social media companies | Immediately
Social media companies must implement tools to enhance the ability of users to tackle online intimidation through user options. | Social media companies | Immediately
All social media companies must ensure they are able to make decisions quickly and consistently on the takedown of intimidatory content online. | Social media companies | Immediately
Twitter, Facebook and Google must publish UK-level performance data on the number of reports they receive, the percentage of reported content that is taken down, and the time it takes to take down that content, on at least a quarterly basis. | Social media companies | At least every quarter, beginning in the first quarter of 2018
Social media companies must urgently revise their tools for users to escalate any reports of potential illegal online activity to the police. | Social media companies | Immediately
The social media companies should work with the government to establish a ‘pop-up’ social media reporting team for election campaigns. | Social media companies | Before the next general election
Social media companies should actively provide advice, guidance and support to Parliamentary candidates on steps they can take to remain safe and secure while using their sites. | Social media companies | Before the next general election
The government should consult on the introduction of a new offence in electoral law of intimidating Parliamentary candidates and party campaigners. | Government | Within one year

References