Ethics and Empathy-Based Human-AI Partnering: Exploring the Extent to which Cultural Differences Matter When Developing an Ethical Technical Standard
Research output: Other contribution › peer-review
IEEE. 2024, White paper.
RIS
TY - GEN
T1 - Ethics and Empathy-Based Human-AI Partnering
T2 - Exploring the Extent to which Cultural Differences Matter When Developing an Ethical Technical Standard
AU - McStay, Andrew
AU - Andres, Fredric
AU - Bakir, Vian
AU - Bland, Ben
AU - Laffer, Alex
AU - Li, Phoebe
AU - Shimo, Sumiko
PY - 2024/8/28
Y1 - 2024/8/28
N2 - This white paper reports insights derived from two workshops held in Tokyo in June 2024, conducted to explore Japanese and regional perspectives on ethical questions regarding the use of technology to emulate empathy, general-purpose artificial intelligence (GPAI), and human-AI partnering. Key recommendations from the workshops are as follows: 1) To ensure ethical interactions, there is a need to distinguish strong empathy from weak empathy in AI-based partners. 2) There is a need to investigate, be aware of, and potentially be self-critical of one’s own ethical position and assumptions about technology. This is not to suggest that one nation is wrong or right, but that it is valuable to understand one’s own orienting beliefs before ethically assessing technology. 3) There is a need to recognize that empathy and emotion are seen in broader “ecological” terms, meaning that they are present in agents other than people. One does not have to agree, but governance should be aware that design and reception of design involve feeling-into things (and even feeling-into things by things). 4) If IEEE P7014.1 is correct and there will be more rather than less empathy-based interaction, there is a need to consider empathy fatigue.
AB - This white paper reports insights derived from two workshops held in Tokyo in June 2024, conducted to explore Japanese and regional perspectives on ethical questions regarding the use of technology to emulate empathy, general-purpose artificial intelligence (GPAI), and human-AI partnering. Key recommendations from the workshops are as follows: 1) To ensure ethical interactions, there is a need to distinguish strong empathy from weak empathy in AI-based partners. 2) There is a need to investigate, be aware of, and potentially be self-critical of one’s own ethical position and assumptions about technology. This is not to suggest that one nation is wrong or right, but that it is valuable to understand one’s own orienting beliefs before ethically assessing technology. 3) There is a need to recognize that empathy and emotion are seen in broader “ecological” terms, meaning that they are present in agents other than people. One does not have to agree, but governance should be aware that design and reception of design involve feeling-into things (and even feeling-into things by things). 4) If IEEE P7014.1 is correct and there will be more rather than less empathy-based interaction, there is a need to consider empathy fatigue.
M3 - Other contribution
PB - IEEE
ER -