Ethics and Empathy-Based Human-AI Partnering: Exploring the Extent to which Cultural Differences Matter When Developing an Ethical Technical Standard

Andrew McStay, Fredric Andres, Vian Bakir, Ben Bland, Alex Laffer, Phoebe Li, Sumiko Shimo

Research output: Other contribution (peer-reviewed)

Abstract

This white paper reports on two workshops held in Tokyo in June 2024, conducted to explore Japanese and regional perspectives on ethical questions regarding the use of technology to emulate empathy, general-purpose artificial intelligence (GPAI), and human-AI partnering. Key recommendations from the workshops are as follows: 1) To ensure ethical interactions, there is a need to distinguish strong empathy from weak empathy in AI-based partners. 2) There is a need to investigate, be aware of, and potentially be self-critical of one’s own ethical position and assumptions about technology. This is not to suggest that one nation is right or wrong, but that it is valuable to understand one’s own orienting beliefs before ethically assessing technology. 3) There is a need to recognize that empathy and emotion are seen in broader “ecological” terms, meaning that they are present in agents other than people. One does not have to agree, but governance should be aware that both design and the reception of design involve feeling-into things (and even feeling-into things by things). 4) If IEEE P7014.1 is correct and there will be more rather than less empathy-based interaction, there is a need to consider empathy fatigue.
Original language: English
Type: White paper
Publisher: IEEE
Publication status: Published - 28 Aug 2024

