Emulated Empathy: Can Risks be Countered by a Soft-law Standard?

Research output: Contribution to journal › Article › peer-review


Abstract

As artificial intelligence systems increasingly emulate empathy – recognizing, interpreting, and responding to human emotional states and psychological contexts, and potentially appearing to genuinely care about a person – questions emerge about the personal and societal implications of these developments. Emulated empathy may enhance usability, engagement, and accessibility, but it also raises concerns about manipulation, commodification of interior life, and detachment from reality. Such issues regarding emulated empathy have been raised in relation to AI companions. While AI systems may arguably possess functional aspects of empathy, mimicry of human empathy also reveals fundamental differences between human and computational forms. Drawing on insights from neuroscience, philosophy of mind, and AI ethics, the paper discusses whether such systems pose a threat to human connection or could instead augment it. Special attention is given to the IEEE P7014.1 standard, which outlines ethical considerations and recommended practices for human-AI partnerships involving emulated empathy. In addition to advancing conceptual understanding of emulated empathy, the paper argues for a proactive governance approach that combines soft law with regulatory safeguards to mitigate harm, uphold trust, and guide responsible design in this emerging domain.
Original language: English
Pages (from-to): 1-7
Journal: IEEE Transactions on Technology and Society
Publication status: Published - 10 Jun 2025

