Promoting Digital Responsibility through Independent Evaluation: Lessons from the first reporting cycle of the Internet Commission

In this article, LSE Visiting Fellow Jonny Shipp examines the key lessons learned from the Commission’s first accountability report and outlines how its work relates to the UN’s Sustainable Development Goals.

In January, the Internet Commission published the results of its first year-long independent review of how organizations make decisions about content, contact, and behavior online. Its Accountability Report 1.0 offers insights into the current state of affairs, shedding new light on how an organization’s day-to-day operations relate to its broader corporate purpose. It may be the earliest example of procedural accountability for content moderation practices: an approach that can support global digital cooperation and emerging legal regulation.

In early June, CEOs of many of the world’s leading technology companies joined the ‘Digital with Purpose’ movement and signed a pledge to governments and policy makers to accelerate the implementation of the Paris Agreement and the UN Sustainable Development Goals. They hope to catalyze collective action across the industry to create a “race to the top” in digital responsibility and ethical business practices, thus reversing the negative consequences of digitalization.

Is the internet trustworthy?

At the end of 2017, on the day the UK Government released its first proposals for what is now the Online Safety Bill, I led a roundtable discussion to explore the idea of an “Internet Commission”. A series of stakeholder workshops followed, to explore the scope of “digital responsibility” and to develop a new accountability process in support of an effective internet regulatory ecosystem. Amid public concern about the operation and negative impacts of social media platforms, most agreed that industry could no longer be allowed to “mark its own homework” and that a new wave of corporate responsibility was required.

A promising approach is for regulators to focus on the organizational processes and procedures surrounding content moderation decisions. Inspired by this, the Internet Commission published its first Evaluation Framework in 2019, and in 2020 it had the opportunity to collect data with a first cohort of “Reporting Partners”: the BBC (broadcasting), Sony PlayStation (online gaming), Popjam (social media), and Meetic and Tinder (online dating).

Each Reporting Partner provided written answers to the questions in our evaluation framework, followed by written clarifications based on our first review of the data. We then conducted interviews and developed a detailed, confidential case study for each organization. Each organization commented on its draft, and through discussion we arrived at a final confidential case study for each participant. We identified key practices and used a model of organizational maturity to probe and test the compatibility of these practices with the stated goals and objectives of the organization. We then agreed the final draft of each of the five cases and combined them as the basis for a private knowledge-sharing workshop. Here, participants exchanged views on common challenges, such as safety by design, rights of appeal, moderator welfare, understanding emerging issues, and the opportunities and limitations of content moderation and age-assurance technologies. This informed the published report, which was scrutinized by a group of nine Council members, drawn in equal numbers from civil society, academia and industry. To ensure the independence of the report, they were given full access to the underlying material and opportunities to discuss it in detail with the authors.

Learning about digital responsibility

First, here are some lessons about the Internet Commission’s reporting process. Reporting Partners told us that we were asking the right questions, but also that there were many more to ask: our supplementary questions served as a useful starting point for iterating our evaluation framework. Our detailed, confidential case studies helped organizations better understand where they are now, and this part of the process prompted some immediate changes. Participants appreciated and benefited from the private dialogue. And although it is difficult, we have shown that it is possible to produce a fair and independent public report while respecting the need for commercial confidentiality.

Second, here is what we learned about practices in the organizations we studied. We identified 24 key practices and assessed their compatibility with a culture of digital responsibility. Despite the diversity of the cohort, we identified eight common challenges: safety by design, moderator welfare, rights of appeal, reporting, customer focus, understanding emerging issues, moderation technologies, and age-assurance technologies. Here are four brief examples of the practices we evaluated:

  • We saw how Popjam, a small company recently acquired by Epic Games, treated online child safety as part of its service design, drawing on its team’s long experience with younger audiences to balance the risks and benefits of private messaging. It decided not to include this feature because younger users do not see it as necessary and the risks around child exploitation are significant.
  • Sony PlayStation is rightly proud of its moderator welfare program: content moderators have a difficult job, reviewing challenging content and behaviors. An established program supports moderators’ emotional needs, and Sony goes a step further, using its supply chain influence to ensure that the same psychological support is available to everyone involved, including moderators employed by third-party vendors.
  • The BBC is a unique organization with a strong, if not always consistent, legacy of editorial responsibility. It aims to encourage healthy public debate, and this is reflected in the way its moderation team manages online interaction. The team takes a clear, principles-based approach, seeking to understand a contributor’s intentions and to give them the benefit of the doubt. Where posts are removed, affected users are encouraged to use an appeals process.
  • Tinder operates in 196 countries and is the most widely used dating app in many European countries. It makes extensive use of automated tools to review public profiles in near real time. Its system is deliberately designed to over-identify potential violations of its guidelines, creating a broad safety net around the automated moderation process. Importantly, this also helps the organization identify and stay abreast of emerging patterns of abuse and fraud (a minimal sketch of this over-flagging approach follows below).
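
To illustrate the design choice described in the Tinder example, here is a minimal, hypothetical sketch of an “over-flagging” review step. It is not Tinder’s actual system: the threshold value, the stub classifier and the function names are all illustrative assumptions. The point is simply that setting the escalation threshold deliberately low routes borderline cases to human moderators and logs them for trend analysis, rather than relying on the model alone.

```python
# Minimal illustrative sketch (not Tinder's actual system): an automated
# reviewer that deliberately "over-flags" borderline profiles so a human
# moderation team, not the model alone, makes the final call.
from dataclasses import dataclass

# Hypothetical threshold, set well below the point of model certainty:
# anything borderline is escalated to human review.
ESCALATION_THRESHOLD = 0.3

@dataclass
class Profile:
    profile_id: str
    text: str

def violation_score(profile: Profile) -> float:
    """Stub classifier returning a 0-1 guideline-violation score.
    A real system would use a trained model; this is a placeholder."""
    flagged_terms = ("contact me off-app", "send money")
    return 1.0 if any(term in profile.text.lower() for term in flagged_terms) else 0.0

def review(profile: Profile, queue: list, trends: list) -> None:
    score = violation_score(profile)
    if score >= ESCALATION_THRESHOLD:
        # Over-inclusive by design: escalate to human moderators rather
        # than auto-removing, creating a broad safety net around automation.
        queue.append(profile.profile_id)
        # Logging every escalation also helps surface new abuse patterns.
        trends.append((profile.profile_id, score))

human_review_queue: list = []
trend_log: list = []
review(Profile("p1", "Hi, send money to my account"), human_review_queue, trend_log)
print(human_review_queue)  # ['p1']
```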

The full report, “Accountability Report 1.0. Online Content, Contact and Behavior: Promoting Digital Responsibility”, is available here.

The Broader Context: Driving “Digital with Purpose”

This month, the Portuguese Presidency of the EU paved the way for a future Charter on Digital Rights with the Lisbon Declaration on digital rights. It was in this context that CEOs of many of the world’s leading technology companies joined the Digital with Purpose movement, signing a pledge to governments and policy makers to accelerate the implementation of the Paris Agreement and the UN Sustainable Development Goals. The work of the Internet Commission is important here because, while digitalization can support the delivery of these goals, the goals can also guide more trustworthy digital development. For example, can we achieve Goal 3, “Good health and well-being”, while harmful content multiplies? Is the achievement of Goal 16, “Peace, justice and strong institutions”, compatible with the spread of misinformation, data privacy concerns or child exploitation? And Goal 9, on “Industry, innovation and infrastructure”, must surely require stronger governance and more work to identify and reduce systemic risks related to the internet. These issues are central to the Internet Commission’s evaluation framework, an updated version of which was published in March.

The updated evaluation framework incorporates learning from the first reporting cycle and feedback from researchers, regulators, policy makers and participating organizations, and reflects key indicators from: the 2020 Ranking Digital Rights framework; the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse; the ICO’s Age Appropriate Design Code; and the Council of Europe’s guidelines on the rights of the child in the digital environment. There are core questions about the scope and purpose of the organization, the people it affects and its management, with additional sections on:

  • Content moderation: How is harmful and illegal contact, content or behavior identified and acted upon?
  • Automation: How are intelligent systems used to promote and/or moderate online content?
  • Safety: What measures are in place to protect people’s health and well-being?

The Digital with Purpose movement has identified digital impacts in five priority areas: Climate Action, Circular Economy, Supply Chain, Digital Inclusion and Digital Trust. It aims for companies to be evaluated and awarded a formal certification, with performance measured annually to track progress. In order to accelerate the realization of the UN Sustainable Development Goals, the Internet Commission is contributing to the Digital Trust theme. By aligning our work, we hope to contribute to a race to the top in digital responsibility. But while transparency has a role to play, it should not become an end in itself, and organizations answering their own questions is not a formula for rebuilding trust.

A second reporting cycle is now underway, with participants including Twitch, Pearson, Meetic and Tinder. Through this work, the Internet Commission identifies and independently assesses how ethical behaviors are embedded in organizational culture through specific processes and practices, thereby promoting digital accountability and contributing to the Digital with Purpose movement.

This article represents the opinions of the author, and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.
