Blurring the moral limits of data markets: biometrics, emotion and data dividends

Research output: Contribution to journal › Article › peer-reviewed

Standard

Blurring the moral limits of data markets: biometrics, emotion and data dividends. / Bakir, Vian; Laffer, Alex; McStay, Andrew.
In: AI & Society, 12.08.2023.

Vancouver

Bakir V, Laffer A, McStay A. Blurring the moral limits of data markets: biometrics, emotion and data dividends. AI & Society. 2023 Aug 12. Epub 2023 Aug 12. doi: https://doi.org/10.1007/s00146-023-01739-5

RIS

TY - JOUR

T1 - Blurring the moral limits of data markets: biometrics, emotion and data dividends

AU - Bakir, Vian

AU - Laffer, Alex

AU - McStay, Andrew

N1 - Economic and Social Research Council (ES/T00696X/1) and Innovate UK (TS/T019964/1).

PY - 2023/8/12

Y1 - 2023/8/12

N2 - This paper considers what liberal philosopher Michael Sandel coins the ‘moral limits of markets’ in relation to the idea of paying people for data about their biometrics and emotions. While Sandel argues that certain aspects of human life (such as our bodies and body parts) should be beyond monetisation and exchange, others argue that emerging technologies such as Personal Information Management Systems can enable a fairer, paid data exchange between the individual and the organisation, even for highly personal data about our bodies and emotions. As the field of data ethics rarely addresses questions of payment, this paper explores normative questions about data dividends. It does so through a UK-wide, demographically representative online survey that quantitatively assesses adults’ views on being paid for personal data about their biometrics and emotions via a Personal Information Management System, producing a data dividend, a premise that sees personal data through the prism of markets and property. The paper finds diverse attitudes based on socio-demographic characteristics, the type of personal data sold, and the type of organisation sold to. It argues that (a) Sandel’s argument regarding the moral limits of markets has value in protecting the fundamental freedoms of those in society who are arguably least able to do so (such as the poor); but (b) that contexts of use, in particular, blur moral limits regarding fundamental freedoms and markets.

AB - This paper considers what liberal philosopher Michael Sandel coins the ‘moral limits of markets’ in relation to the idea of paying people for data about their biometrics and emotions. While Sandel argues that certain aspects of human life (such as our bodies and body parts) should be beyond monetisation and exchange, others argue that emerging technologies such as Personal Information Management Systems can enable a fairer, paid data exchange between the individual and the organisation, even for highly personal data about our bodies and emotions. As the field of data ethics rarely addresses questions of payment, this paper explores normative questions about data dividends. It does so through a UK-wide, demographically representative online survey that quantitatively assesses adults’ views on being paid for personal data about their biometrics and emotions via a Personal Information Management System, producing a data dividend, a premise that sees personal data through the prism of markets and property. The paper finds diverse attitudes based on socio-demographic characteristics, the type of personal data sold, and the type of organisation sold to. It argues that (a) Sandel’s argument regarding the moral limits of markets has value in protecting the fundamental freedoms of those in society who are arguably least able to do so (such as the poor); but (b) that contexts of use, in particular, blur moral limits regarding fundamental freedoms and markets.

KW - Biometric data

KW - Emotion data

KW - Data ethics

KW - Data dividends

KW - Personal information management system

KW - Personal data stores

U2 - https://doi.org/10.1007/s00146-023-01739-5

DO - https://doi.org/10.1007/s00146-023-01739-5

M3 - Article

JO - AI & Society

JF - AI & Society

ER -