Standard

Misinformation and trusted voices: Addressing false information online via provision of authoritative information: Why dialling down emotion is part of the answer. / Bakir, Vian; McStay, Andrew.
7 pp. UK Parliament. 2022. Written evidence invited by Commons Select Committee.

Research output: Other contribution


RIS

TY - GEN

T1 - Misinformation and trusted voices

T2 - Addressing false information online via provision of authoritative information: Why dialling down emotion is part of the answer

AU - Bakir, Vian

AU - McStay, Andrew

PY - 2022/10/12

Y1 - 2022/10/12

N2 - Invited Submission by Vian Bakir & Andrew McStay to DCMS Online Harms and Disinformation Sub-Committee Inquiry into Misinformation and Trusted Voices. False information proliferates online, despite years of multi-stakeholder efforts to quell it. In September 2022, Vian Bakir (Prof. of Journalism & PolComms, SHiLSS) and Andrew McStay (Prof. of Digital Life, SHiLSS) were invited by the UK Parliament’s Online Harms and Disinformation Sub-Committee to provide evidence to their Inquiry into Misinformation and Trusted Voices. They addressed one of the Inquiry’s questions, namely: Is the provision of authoritative information responsive enough to meet the challenge of misinformation that is spread on social media? Now published, one of their conclusions is that rather than having to make difficult content moderation decisions about what is true and false on the fly and at scale, it may be better to ensure that digital platforms’ algorithms optimise emotions for social good rather than just for the profit of the platform and its advertisers. What this social good optimisation would look like is worthy of further study, but they posit that it would likely involve dialling down the platform’s emotional contagion and engagement of users.

AB - Invited Submission by Vian Bakir & Andrew McStay to DCMS Online Harms and Disinformation Sub-Committee Inquiry into Misinformation and Trusted Voices. False information proliferates online, despite years of multi-stakeholder efforts to quell it. In September 2022, Vian Bakir (Prof. of Journalism & PolComms, SHiLSS) and Andrew McStay (Prof. of Digital Life, SHiLSS) were invited by the UK Parliament’s Online Harms and Disinformation Sub-Committee to provide evidence to their Inquiry into Misinformation and Trusted Voices. They addressed one of the Inquiry’s questions, namely: Is the provision of authoritative information responsive enough to meet the challenge of misinformation that is spread on social media? Now published, one of their conclusions is that rather than having to make difficult content moderation decisions about what is true and false on the fly and at scale, it may be better to ensure that digital platforms’ algorithms optimise emotions for social good rather than just for the profit of the platform and its advertisers. What this social good optimisation would look like is worthy of further study, but they posit that it would likely involve dialling down the platform’s emotional contagion and engagement of users.

KW - emotional AI

KW - false information

M3 - Other contribution

PB - UK Parliament

ER -