On manipulation by emotional AI: UK adults’ views and governance implications
Research output: Contribution to journal › Article › peer-reviewed
Electronic versions
Documents
- On manipulation by emotional AI
Final published version, 2.68 MB, PDF document
License: CC BY
With growing commercial, regulatory and scholarly interest in the use of Artificial Intelligence (AI) to profile and interact with human emotion (“emotional AI”), attention is turning to its capacity for manipulating people, i.e., for shaping the factors that influence a person’s decisions and behavior. Given prior social disquiet about AI and profiling technologies, surprisingly little is known about people’s views on the benefits and harms of emotional AI technologies, especially their capacity for manipulation. This matters because regulators of AI (such as those in the European Union and the UK) wish to stimulate AI innovation, minimize harms and build public trust in these systems, but to do so they should understand the public’s expectations. Addressing this, we ascertain UK adults’ perspectives on the potential of emotional AI technologies for manipulating people through a two-stage study. Stage One (the qualitative phase) uses design fiction principles to generate adequate understanding and informed discussion in 10 focus groups with diverse participants (n = 46) on how emotional AI technologies may be used in a range of mundane, everyday settings. The focus groups primarily flagged concerns about manipulation in two settings: emotion profiling in social media (involving deepfakes, false information and conspiracy theories), and emotion profiling in child-oriented “emotoys” (where the toy responds to the child’s facial and verbal expressions). In both settings, participants express concerns that emotion profiling covertly exploits users’ cognitive or affective weaknesses and vulnerabilities; in the social media setting, participants additionally express concerns that emotion profiling damages people’s capacity for rational thought and action. To explore these insights at a larger scale, Stage Two (the quantitative phase) comprises a UK-wide, demographically representative national survey (n = 2,068) on attitudes toward emotional AI.
Taking care to avoid leading and dystopian framings of emotional AI, we find that large majorities express concern about the potential for being manipulated through social media and emotoys. In addition to signaling the need for civic protections and practical means of ensuring trust in emerging technologies, the research also leads us to provide a policy-friendly subdivision of what is meant by manipulation through emotional AI and related technologies.
Original language | English |
---|---|
Article number | 1339834 |
Journal | Frontiers in Sociology |
Volume | 9 |
Status | Published - 7 Jun 2024 |