Invited Submission by Vian Bakir & Andrew McStay to DCMS Online Harms and Disinformation Sub-Committee Inquiry into Misinformation and Trusted Voices. 

False information proliferates online, despite years of multi-stakeholder efforts to quell it. In September 2022, Vian Bakir (Prof. of Journalism & PolComms, SHiLSS) and Andrew McStay (Prof. of Digital Life, SHiLSS) were invited by the UK Parliament's Online Harms and Disinformation Sub-Committee to provide evidence to its Inquiry into Misinformation and Trusted Voices. They addressed one of the Inquiry's questions: is the provision of authoritative information responsive enough to meet the challenge of misinformation spread on social media? Now published, one of their conclusions is that rather than making difficult content moderation decisions about what is true and false on the fly and at scale, it may be better to ensure that digital platforms' algorithms optimise emotions for social good rather than just for the profit of the platform and its advertisers. What this social-good optimisation would look like is worthy of further study, but they posit that it would likely involve dialling down platforms' emotional contagion and engagement of users.

Keywords

  • emotional AI, false information
Original language: English
Type: Written evidence invited by Commons Select Committee
Medium of output: Text
Publisher: UK Parliament
Number of pages: 7
Publication status: Published - 12 Oct 2022
