The Moral Threshold of AI Companionship and Desired Governance: Reflections from the UK Public

Research output: Working paper › Preprint

Abstract

There has been a rapid rise in the provision and use of Artificial Intelligence (AI) companions, with scope for deep extension into personal and social life. This makes it important to understand the public's views on the perceived benefits of AI companions, ethical concerns, and governance preferences. Based on a demographically representative national survey of UK adults (n = 2073), the study finds that while the UK public acknowledges the potential of AI companions in areas such as education, convenience, entertainment, and reducing loneliness, there is significant concern about emotional dependency, deception through some forms of anthropomorphism, and risks to children. The research highlights generational differences in AI acceptance: younger demographics show greater familiarity with and openness toward AI companionship, whereas older groups express stronger concerns about its societal impact. The findings further reveal a moral threshold for the acceptability of anthropomorphic design features in AI companionship, particularly where ontological uncertainty arises, that is, where users are unsure whether AI systems possess agency or emotional depth. On governance, most respondents favour regulation of AI companions, viewing it as necessary and compatible with potential co-regulation frameworks. These findings contribute to ongoing discussions about the governance of AI companions, emphasising the need for what we call 'attentive pragmatism' to balance technological advancement, novel design features, media literacy, and ethical safeguards.
Original language: English
Publisher: Social Science Research Network (SSRN)
Publication status: Submitted - 13 Mar 2025

