Chapter 9 A deeper look

Introduction

During the coronavirus pandemic, the term ‘infodemic’ came into use. It referred both to the provision of excessive quantities of information about the outbreak and to misleading communications. The latter tendency was particularly problematic. It led members of the public to endanger their health by refusing care or using fake treatments. It also encouraged conspiracy theories, leading in some instances to attempts to sabotage infrastructure or attack people working in the emergency response. Sources of misinformation included hostile foreign states and groups; senior world figures such as the then-US President, Donald Trump; people motivated by financial gain; and individuals who were not malicious but mistaken. There were problems of mistrust of public authorities, including among some Black, Asian and Minority Ethnic (BAME) groups (for discussion of ethnicity, see: chapter 10). Some people were reluctant to attend hospital or to receive vaccinations. Others used harmful home-made false treatments. There was evidence of related hate crimes against Asians, who were the target of false claims about the spread of the virus.

At the centre of many of these problems were the Internet and social media. They were a principal means by which misinformation was disseminated, augmented by the algorithms they employed. For instance, a false documentary disseminated via YouTube achieved huge reach, with damage on a scale that was difficult to counteract. The social media platforms themselves took some action to correct the record, label material as problematic, or remove it altogether. Eventually even tweets by Trump had warnings attached to them. However, there was criticism of inconsistency and a lack of transparency in their approach, and of their failure to respond swiftly enough to the challenge of the ‘infodemic’.

Beyond self-regulation by these platforms, there were independent fact-checking organisations and reliable contributions from public service broadcasters. The UK government operated a Counter-Disinformation Unit and a Rapid Response Unit, helping it to track developments, issue denials of false information, and, where necessary, work with platforms to take down false content. However, in a 2020 report, the House of Commons Digital, Culture, Media and Sport Committee (2020: 3) concluded that the existing regulatory environment was inadequate to meet the challenge. It wrote that:

‘Whilst tech companies have introduced new ways of tackling misinformation through the introduction of warning labels and tools to correct the record, these innovations have been applied inconsistently, particularly in the case of high-profile accounts. Platform policies have also been too slow to adapt, while automated content moderation at the expense of human review and user reporting has had limited effectiveness. The business models of tech companies themselves disincentivise action against misinformation while affording opportunities to bad actors to monetise misleading content. At least until well-drafted, robust legislation is brought forward, the public is reliant on the goodwill of tech companies, or the bad press they attract, to compel them to act.’

Diving Deeper

Communications and democracy

Communications are vital to the functioning of a democracy. They are a means by which people organise politically, express preferences to those who govern on their behalf in a representative democracy, and receive information from them (see: chapter 1). In a democracy, diversity and freedom of expression in communications are essential. Differing and competing viewpoints and perspectives need to be available. They enable a meaningful range of options to be presented to the public, and allow those who hold office to be held to account. But free expression is always open to abuse, or to the perception that it is being abused. Some groups or individuals may unintentionally or deliberately mislead others in pursuit of their objectives, lowering the standard of public debate. They may include those within or outside a given state who intend to undermine the democratic system itself; or those seeking material gain regardless of the negative social consequences of their actions. Furthermore, some observers of and participants in politics may argue that the system of communications is structured in a way that gives undue attention to problematic viewpoints. Yet intervention to rectify perceived defects in the operation of communications is likely to meet with objections about inappropriate interference.

Technology and politics

The experience of misinformation during the pandemic suggests that the means – or media – by which communication takes place are not wholly neutral. The form they take influences the outcome, determining the types of information that reach the public. The Internet is a technology that provides an enormous diversity of content, received by a substantial portion of the population. It was well suited to the rapid dissemination of public health information. But it was also a suitable vehicle for irresponsible material. The Internet is difficult to regulate in a way that both adheres to basic principles of freedom of expression and eliminates serious cases of misinformation.

The role of the public and private sectors in politics

During the pandemic, broadcasters that were either publicly owned or private corporations required by law to provide a public service performed an essential role in disseminating information intended to keep the public informed about developments and protect their wellbeing. Other entities – such as social media platforms – made a more ambiguous contribution. They were able to help supply useful communications. But they were also conduits for misinformation, which their highly profitable business models allowed for. Indeed, not only did the firms profit from misinformation, but financial incentives were provided to those who were sources of the material. While such corporations did take steps to counteract these problems, some felt they could have done more, and they were resistant to the idea of more effective regulation. This controversy illustrates the point that public institutions alone are not sufficient to sustain a democratic political system. Private sector organisations, and indeed individuals, can undermine or support the system through their actions, or indeed their failure to act. In the case of the technology corporations, a further challenge arose from their multinational nature, which to some extent placed them above and beyond the normal methods of state-level regulation and accountability.

Summary

The pandemic that began in 2020 was accompanied by the so-called ‘infodemic’.

It involved misinformation that led to people refusing treatment, harming themselves with false remedies, targeting certain groups, and acting on conspiracy theories. It came from a range of sources, within and outside the UK, and with a variety of motives.

Social media companies lay at the centre of the ‘infodemic’. While they made some efforts to self-regulate, there was criticism of the shortcomings of their approach. Other attempts to counteract misinformation came from fact-checkers and the government.

Bodies including the House of Commons Digital, Culture, Media and Sport Committee called for legislation to enforce more effective regulation.

Test your knowledge


What does the ‘infodemic’ reveal about the significance of the Internet to UK politics?

  • The Internet has swiftly become a principal source of information in the UK as elsewhere.
  • It can be a vehicle for malign forces; but for the foreseeable future, it will remain a central feature of political communication.
  • Determining the steps to take to avoid the problems associated with the Internet and social media has become a subject of prominent public controversy.
  • There are doubts that self-regulation will be sufficient, and some argue that politicians and political institutions will need to act to regulate the sector.

What means might there be of counteracting online misinformation?

  • Self-regulation by technology companies, whether through automated moderation or human review and user reporting.
  • Civil society initiatives such as fact-checking.
  • Government action, whether working with the sector or through legislative intervention.
  • More positive action, such as the use of quality journalism and public service broadcasting.

References

House of Commons Digital, Culture, Media and Sport Committee. 2020. Misinformation in the COVID-19 Infodemic. London: House of Commons.
