Regulating for Responsible Technology

Regulating for Responsible Technology is a report by Doteveryone, setting out its favoured options for social media and Internet regulation, in the context of the UK government's Internet Safety Strategy.[1]

Main recommendations

The report makes three key recommendations for an Office for Responsible Technology.

Establish a new Office for Responsible Technology with three functions:

  1. Empower regulators. The Office sits above existing regulators, identifies the gaps in regulation and supports regulators with the expertise and foresight to respond to digital technologies as they affect their sectors.
  2. Inform the public and policymakers. The Office creates an authoritative body of evidence about the benefits and harms of technologies to underpin the work of regulators, builds public awareness, and engages all parts of society to create consensus around a future vision for technology to underpin the regulatory system.
  3. Support people to find redress. The Office ensures the public can hold technologies to account for individual and collective harms derived from their use, and mediates unresolved disputes.[2]

Recommendation 1: Empower regulators

  1. Identify where issues fall between regulators and recommend new remits and powers.
  2. Build regulators’ digital capabilities and promote knowledge transfer with industry.
  3. Lead foresight activities to anticipate opportunities and challenges of digital technologies.[3]

Recommendation 2: Inform the public and policymakers

  1. Understand and articulate the values that underpin regulation through engagement with all parts of society.
  2. Commission and conduct research into the benefits and harms of technologies to inform regulators and policymakers.
  3. Provide clear, understandable information and guidance to the public.[4]

Recommendation 3: Support people to find redress

  1. Set best practice for handling public complaints about impacts of technologies and audit how companies behave.
  2. Provide backstop mediation.
  3. Share insights to flag emerging issues and inform regulatory practice.[5]


Addressing regulatory weaknesses

The report argues that weaknesses within government and regulatory structures need to be addressed.

  • Issues fall through the gaps between regulators. Ofcom has pointed out the failure to regulate content hosted on social media – but without direction from Parliament it cannot step outside its existing remit to address this. Other areas such as online political campaigning and targeted advertising also fall in the grey areas between existing bodies.

  • Regulators lack expertise and resources. Even the Information Commissioner’s Office (ICO) – the leading regulator in this area – significantly lags behind the tech sector. Recent changes to ICO hiring policies are starting to address this. But there remains a major imbalance between industry and regulators.
  • Regulators react too late. Most regulators look backwards, not forwards. The Electoral Commission's reports on the impacts of digital advertising on the referendum and election campaigns, for example, came months after the votes, with the outcomes already decided. The Competition and Markets Authority has only recently set up a data unit and announced a review of modern consumer markets – after the tech sector has already outgrown all others in market capitalisation. A recent survey of businesses found 92% expect a negative impact if sectoral regulators don't adapt to disruptive change.
  • Societal impacts are out of scope. There's a focus on the protection of individuals, such as the data protection rights conferred through GDPR. But there's little consideration of broader social impacts such as algorithmic discrimination, where both the Alan Turing Institute and the Information Commissioner have identified the need for strengthened regulation. The CMA's focus on individual consumer welfare over broader public interest has been identified as a reason it has failed to respond to the data-driven business models of many online services.[6]

For this reason, Doteveryone recommends that a regulator be given co-ordination duties across government.

Finance and powers

The Office for Responsible Technology will provide backstop mediation and alternative dispute resolution where other means for redress have failed. This work will be funded by industry, creating an incentive to reduce the number of cases which reach this point …

The Office must have teeth to ensure all parties comply with its decisions. Ombudsmen with too few powers have been criticised as ineffectual, and businesses participate in only 6% of cases where their involvement is voluntary.

The potential scale of cases, particularly related to social media content, means there will need to be imaginative approaches to creating a workable mechanism for mediation.[7]

Commentary

Breadth of work of proposed body

There are challenges for a regulator taking on a large range of duties. Risks include lacking focus, being unable to meet expectations, and having contradictory internal aims. There is also a risk, as Doteveryone acknowledges, of the body becoming an "everything regulator" for technology and the Internet. Indeed, the policy space envisaged as falling in scope is all digital technology, covering not just online communications but also aspects of the processing of personal data and the use of technology in transportation and other industries, ranging as far as impacts on employment.

Challenges for redress body

There is an obvious question for any body charged with regulating the effects of technology: whether a given problem is entirely new, or is already someone's responsibility but neglected for reasons of knowledge, capacity or legislative weakness.

Its two proposed main functions are:

  1. Set best practice for handling public complaints about impacts of technologies and audit how companies behave.
  2. Provide backstop mediation.

In the case of setting best practice and a general audit function, this is most easily done across relatively narrow fields of action. The broader the remit, arguably, the harder it will be for a regulator to intervene effectively and usefully, as specialist knowledge will be sacrificed.

The report acknowledges this tension, and proposes that complaints handling may be devolved elsewhere, if systems exist. It leaves an open question as to who exactly should take on these roles for each sector, but provides that a complaints body should always exist.[8]

The report anticipates that the majority of complaints would be handled by the companies themselves in the first instance:

The Office for Responsible Technology will set best practice for how digital technology services handle complaints. It will audit this through twice yearly reporting on complaints handling and with spot checks on individual cases. The Office will rate these processes and organisations will have to display this rating prominently. There will be the power to sanction companies that consistently and seriously fail to provide adequate redress.[9]

This is a reasonable approach for any external system of regulation, placing the main burden on audit, with only a handful of escalated complaints dealt with directly.

Kinds of mediation proposed

For backstop mediation, however, the challenges that arise are immediately very significant. This proposal is the closest to delivering the kind of policy initiative that the government is seeking for its White Paper – that is to say, regulation of social media content – and so deserves particular attention and examination.

The report mentions a number of different areas where mediation might be proposed. For example:

Money Saving Expert Martin Lewis is a high profile example of this frustration and has launched court proceedings against Facebook over fake adverts that use his image after failing to get a satisfactory response.[10]

This kind of case illustrates the care and attention that must be applied to any mediation system. The complaint has three parties: Martin Lewis; the advertiser; and the platform. In these kinds of cases it is not immediately obvious that the complaint should or can be resolved by the platform, as the party complained about may have a different view. Thus the process must be designed to reflect the interests and rights of each party. It should not be assumed that the party hosting the content is capable of assessing a complaint. Rather, any mediation or complaints process must allow evidence to be presented and independently assessed.

In practice, mediation processes are best suited to situations where all parties are interested in a fair conclusion, such as a contractual dispute between economic actors, where a settlement is the most important result. Where the desire for settlement is less clear, the process becomes less a matter of mediation than of adjudication.

There is some danger that the proposal assumes most disputes are between the platform and a user, where resolution is in the interests of both, as the platform at least is generally disinterested in the content as such. This is less likely to be the case when the dispute is in fact between two or more users of the platform.

An additional complexity arises from the current underlying liability regime. If the platform, as at present, becomes liable once a party informs it of the content, the presumption will be that the complainant will get their way and the content will be removed. This aspect needs to be addressed in future work from Doteveryone if the proposal is to be fully rounded.

Types of complaint the body may deal with

A weakness of the proposal is the breadth of complaints it might have to handle. The report highlights a case study of "accountable algorithms" and discusses financial consequences of automated decisions:

when, for example, an individual fails to get redress from their mortgage broker because they are paying more interest than their neighbours, they could escalate their complaint to the Office for Responsible Technology. The Office could then assess the handling of the dispute and run an independent algorithmic audit to see if their claims of racial bias are substantiated.[11]

These are important observations, but they illustrate the underlying tension: technologies are embedded in institutions governed by sectoral laws that determine whether particular practices are lawful – such as financial regulation and competition law – and then by broader frameworks like data protection for specific questions such as the processing of personal data. It is unclear how effectively a cross-cutting regulator could intervene directly in complaints handling of this nature.
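
To give a sense of what the "independent algorithmic audit" quoted above might involve at its simplest, here is a minimal illustrative sketch, not drawn from the report. It assumes hypothetical mortgage pricing data labelled by demographic group, and uses a simple permutation test to ask whether the gap in mean interest rates between groups is larger than random assignment of labels would explain; the group names and figures are invented for illustration.

```python
import random
from statistics import mean

def rate_disparity(rates_a, rates_b):
    """Difference in mean interest rate between two groups."""
    return mean(rates_a) - mean(rates_b)

def permutation_test(rates_a, rates_b, n_iter=10_000, seed=0):
    """Estimate how often a gap at least as large as the observed one
    would arise if group labels were assigned at random (a p-value)."""
    rng = random.Random(seed)
    observed = abs(rate_disparity(rates_a, rates_b))
    pooled = rates_a + rates_b
    n_a = len(rates_a)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if abs(rate_disparity(pooled[:n_a], pooled[n_a:])) >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical audit data: interest rates (%) offered to two demographic groups.
group_a = [3.1, 3.4, 3.2, 3.6, 3.3, 3.5, 3.2, 3.4]
group_b = [3.8, 3.9, 3.7, 4.0, 3.6, 3.9, 3.8, 4.1]

print(f"Observed gap: {rate_disparity(group_b, group_a):.2f} percentage points")
print(f"Permutation p-value: {permutation_test(group_a, group_b):.4f}")
```

Even if such a test flagged a statistically significant disparity, whether that disparity is unlawful would still turn on the sectoral frameworks described above – which is precisely the tension the report leaves open.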

Appropriateness of a state regulator dealing with free expression

The question of whether it is appropriate for a state regulator to be placed in the role of arbiter of free expression on private platforms is not addressed in the report.

By targeting the better enforcement of community guidelines and terms and conditions, a mediator would be de facto ruling on content not on the basis of human rights standards, but on the basis of the contract between parties using a platform. While this may produce fairer and more accurate results, it would also mean that a state body would be adjudicating on and restricting otherwise legal expression.

This creates a general loophole for governments and social actors to pressurise platforms into tightening terms and conditions, ruling increasing amounts of speech unacceptable, and then providing for efficient state enforcement of these restrictions. While this is not the intention of the proposal, it could easily be the result.

Furthermore, community guidelines are themselves often arbitrary and are not implemented on the basis of principle. For a regulator to be effective and fair, it would need to engage with the rules set by a platform and make recommendations to refine those guidelines, respecting the audience and business model chosen by the platform – for instance, whether it is aimed at children, teenagers or adults, or at a particular community or activity. These are not matters that ought to be the business of a government regulator, but they are necessary for effective independent regulation of social media.

In other cases, it may be adjudicating on unlawful behaviour, restricting content and activity rather than seeking redress for individuals.

Conclusion

The role envisaged for a cross-cutting body to develop policy, understand developments and aid regulators as technology affects their work is a proposal with a great deal of merit. It would improve the effectiveness of government and technology policy.

This, however, does not necessarily have to sit alongside the envisaged complaints role. Customer and consumer complaints of many types have differing mechanisms and regulatory frameworks that are likely to be more decisive than the technology factor in achieving redress.

When applied to social media complaints, there are significant concerns about the potential for any regulator, especially a state-backed one, to restrict free expression beyond what the law demands. Additionally, the challenge for independent regulation is likely to be quite specific to particular companies, turning on very specific business and community expectations that may not even relate to a notion of harm and needing a level of dialogue that is not appropriate for a state body. Furthermore, dispute resolution and mediation are unlikely to be the right model for many disputes on social media.

Proposals in this area also need to take account of the international nature of social media companies. Independent self-regulation of social media should be considered as a means to bridge these tensions.

References

  1. Miller C, Ohrvik-Stott J, Coldicutt R. (2018) Regulating for Responsible Technology: Capacity, Evidence and Redress: a new system for a fairer future. London: Doteveryone.
  2. Ibid, p6
  3. Ibid, p6
  4. Ibid, p6
  5. Ibid, p6
  6. Ibid, p12
  7. Ibid, p23
  8. "Based on the Office’s review of regulators’ powers and capacity, this function may be devolved to the existing or future ombudsmen associated with particular sectors, or it may be more effective to keep this as an independent function within the Office. The Office will lead regular reviews of this new body’s performance to ensure they are delivering the redress the public needs." Ibid, p22
  9. Ibid, p23
  10. Ibid, p22
  11. Ibid, p24