Misinformation and trusted voices: Addressing false information online via provision of authoritative information: Why dialling down emotion is part of the answer

Research output: Other contribution


Abstract

Invited Submission by Vian Bakir & Andrew McStay to DCMS Online Harms and Disinformation Sub-Committee Inquiry into Misinformation and Trusted Voices. 

False information proliferates online, despite years of multi-stakeholder efforts to quell it. In September 2022, Vian Bakir (Prof. of Journalism & PolComms, SHiLSS) and Andrew McStay (Prof. of Digital Life, SHiLSS) were invited by the UK Parliament's Online Harms and Disinformation Sub-Committee to provide evidence to its Inquiry into Misinformation and Trusted Voices. They addressed one of the Inquiry's questions: is the provision of authoritative information responsive enough to meet the challenge of misinformation spread on social media? Now published, one of their conclusions is that rather than making difficult content moderation decisions about what is true and false on the fly and at scale, it may be better to ensure that digital platforms' algorithms optimise emotions for social good rather than solely for the profit of the platform and its advertisers. What this social-good optimisation would look like is worthy of further study, but they posit that it would likely involve dialling down platforms' emotional contagion and engagement of users.
Original language: English
Type: Written evidence invited by Commons Select Committee
Media of output: Text
Publisher: UK Parliament
Number of pages: 7
Publication status: Published - 12 Oct 2022

Keywords

  • emotional AI
  • false information
