Digital Platform Governance and the Challenges for Trust and Safety: Part 1, 2 and 3

Summary
"There is growing evidence that harms to human rights, both directly online and with offline impact, are associated with gaps in the wider existing national and international regulatory regimes within which these companies operate."
This three-part series of reports - developed by Research ICT Africa, a digital policy, regulation, and governance think tank based in South Africa - looks at the evidence around policies and practices governing digital platforms, highlighting some of the challenges and possible solutions for internet governance. The series was produced as evidence-based input to the consultative processes of the United Nations Educational, Scientific and Cultural Organization (UNESCO)'s project titled "Guidance for regulating digital platforms: a multi-stakeholder approach". The project involves multi-stakeholder consultations to create guidelines that support regulators, governments, legislatures, and companies in dealing with content that potentially damages human rights and democracy, while protecting freedom of expression and the availability of accurate and reliable information. (See Related Summaries below for the latest Draft Guidelines developed as part of this process.)
The evidence reviewed in this brief rests on work by the academic, civil society, and journalistic communities, as well as on documents from the platforms themselves. More than 800 documents, mainly published between 2020 and 2022, were identified and assessed with a view to informing current debates about regulatory frameworks. All three reports conclude with a list of recommendations related to the particular issue examined.
The three parts are as follows (click on the headings to access each report in PDF format):
Part 1: Why Lies and Hatred Proliferate on Digital Platforms - This first report tackles the what and the why of problems in platform content and highlights the following key trends:
- Online and platform content that may cause harm through the breach of human rights is sufficiently widespread to have raised concerns about the potentially severe implications for the future of trust, safety, democracy, and sustainable development.
- A certain amount of this content is curbed by the dominant commercial platforms' content moderation mechanisms. Much still escapes their nets and, in the worst cases, is algorithmically amplified and even supported by advertising.
- Some smaller platforms expressly allow hatred and conspiracy theories, even facilitating the organisation of offline attacks on democracy.
- The roots of the problems lie in "attention economics", automated advertising systems, external manipulators, company spending priorities, and stakeholder knowledge deficits.
- Of value in addressing these problems will be the development of guidelines for regulating platforms, centred on safeguarding human rights, promoting transparency, and limiting the business processes and technical mechanisms that underpin potentially harmful content online.
- Guidance emphasising human rights as the appropriate vantage point for assessing problems and regulatory solutions can help entrench these international standards as foundational for any regulatory regimes.
- Any law-based resort to a regulatory "cure" for the platform ills must be structured to avoid worsening the "disease", given that state actors are frequently implicated in contributing to online content that threatens human rights and may overstep their roles regarding the control of online content.
- Guidance can advise that regulating only the largest platforms is insufficient, even if regulatory regimes should at the same time be nuanced in terms of platform size and role, and that privacy-protecting encryption on messaging services should not be compromised.
Part 2 - This second report assesses the platforms' own content policies and practices and highlights the following key trends:
- Platform policies lack clarity about how they relate to one another, as well as about how they should be applied at global and local levels.
- How platforms understand and identify harms is insufficiently mapped to human rights standards, and there is a gap in how policy elements should deal with different rights or with business models when there are tensions.
- Policies are not always transparent and do not provide sufficiently for risk assessment.
- Implementation and enforcement by platforms have serious shortfalls, while attempts to improve outcomes by automating moderation have their limits.
- Inequalities in policy and practice abound in relation to different categories of people, countries, and languages.
- Of value in addressing these problems could be the development of guidance for governance and regulatory frameworks that sets out suggested standards and parameters for platform policies and related operations.
- Statutory authorities should not seek to take over the direct policy formulation nor the ongoing moderation work by the companies themselves, but they can set objectives, policy standards, and process benchmarks that apply to solo-, self-, and co-regulatory mechanisms that can ensure more effective performance by the platforms themselves.
- Guidance can strongly encourage a combination of regulatory arrangements to ensure that moderation policies and practices are properly integrated with international human rights standards, and to spell out how the global and local dimensions should be balanced.
- Guidance could insist that all relevant platform policy documents be public and available in the primary languages of the countries where companies offer their services.
Part 3 - This third report examines regulatory responses to these problems and highlights the following key trends:
- Platform problems are linked to the fact that platforms are not self-governing according to agreed industry standards but are mainly "solo-governing" when it comes to content curation and moderation.
- Reaction to the failure of current platform efforts to regulate content includes the danger of over-regulation by state entities, which carries real risks to freedom of expression.
- The purview of what may need to be part of new regulatory arrangements includes the interplay between policy, practice, business models, and technology.
- There is a pluralism of platforms and other actors in the "tech stack" who have different roles in the online content landscape, with concomitant implications for regulatory arrangements.
- Independent media, whistle-blowers, and civil society organisations are significant factors in pushing for platform accountability, and regulatory protections and support for these transparency mechanisms should be considered.
- New technology is raising new challenges for platforms' content moderation.
- Platform policy and practice are especially significant for elections.
- Regulatory arrangements in all forms and by all actors need to mainstream a human rights approach; statutory ones should be articulated with other official policy areas and regulatory bodies and should avoid the range of pitfalls that can erode freedom of expression.
- Statutory law and regulation should encompass not only what should not be done (such as platforms not amplifying illegal content) but also what should happen (for instance, greater transparency, rule-governed moderation process, and independent impact assessments).
- An international guidance framework could provide a common reference point for emerging and fragmented regulatory regimes and for supporting decentralised platform alternatives, such as governmental presence on non-profit and decentralised services like the fediverse.
Source
UNESCO website on June 8, 2023. Image credit: Research ICT Africa