Standard

‘This time with feeling?’ Assessing EU data governance implications of out of home appraisal based emotional AI. / McStay, Andrew; Urquhart, Lachlan.
In: First Monday, Vol. 24, No. 10, 07.10.2019.

Research output: Contribution to journal › Article › peer-review

RIS

TY - JOUR

T1 - ‘This time with feeling?’ Assessing EU data governance implications of out of home appraisal based emotional AI

AU - McStay, Andrew

AU - Urquhart, Lachlan

N1 - We thank the ESRC for support through our grant: ES/S013008/1.

PY - 2019/10/7

Y1 - 2019/10/7

N2 - The boundaries of personal space and borders of bodily integrity are being tested by deployments of emotional artificial intelligence (EAI) in private and public spaces. By means of sensing, seeing and machine learning of facial expressions, voice, gaze, gestures and a range of physiological signals (heart rate, skin conductivity and temperature, muscle activity, body temperature, respiration and other bio-signals), the goal is to make interior emotional life machine-readable for personal, commercial and security objectives. In this paper, we focus on computer vision and face-based analytics to consider the nature, method and development of facial coding, the potential demise of existing approaches, and the rise of even more invasive methods. Criticisms of facial coding have long existed, but recent scholarship and industrial development signal a lack of confidence in ‘basic emotions’ and a turn to appraisal-based accounts of emotion. This inevitably entails use of data not only about internal physiological and experiential contexts, but also about factors external to an individual. To explore this, the paper asks and answers the following question: with regard to deployment in out-of-home situations, what are the legal and privacy implications of appraisal-based emotion capture?

AB - The boundaries of personal space and borders of bodily integrity are being tested by deployments of emotional artificial intelligence (EAI) in private and public spaces. By means of sensing, seeing and machine learning of facial expressions, voice, gaze, gestures and a range of physiological signals (heart rate, skin conductivity and temperature, muscle activity, body temperature, respiration and other bio-signals), the goal is to make interior emotional life machine-readable for personal, commercial and security objectives. In this paper, we focus on computer vision and face-based analytics to consider the nature, method and development of facial coding, the potential demise of existing approaches, and the rise of even more invasive methods. Criticisms of facial coding have long existed, but recent scholarship and industrial development signal a lack of confidence in ‘basic emotions’ and a turn to appraisal-based accounts of emotion. This inevitably entails use of data not only about internal physiological and experiential contexts, but also about factors external to an individual. To explore this, the paper asks and answers the following question: with regard to deployment in out-of-home situations, what are the legal and privacy implications of appraisal-based emotion capture?

U2 - 10.5210/fm.v24i10.9457

DO - 10.5210/fm.v24i10.9457

M3 - Article

VL - 24

JO - First Monday

JF - First Monday

IS - 10

ER -