Attitudes Towards Emotional Artificial Intelligence Use: Transcripts of Citizen Workshops Collected Using an Innovative Narrative Approach, 2021

The data were collected during citizen workshops, conducted online via Zoom, exploring attitudes towards emotional artificial intelligence (EAI) use. EAI is the use of affective computing and AI techniques to try to sense and interact with human emotional life, ranging from monitoring emotions through biometric data to more active interventions.

10 sets of participants (n=46) were recruited for the following groups:
• 3 older (65+) groups: n=13
• 3 younger (18-34) groups: n=12
• 2 groups of people self-identifying as disabled: n=10
• 2 groups of members of UK ethnic minorities: n=11

There was an attempt to balance other demographic categories where possible. Participants were grouped by age, as this has been shown to be the biggest indicator of differences in attitude towards emotional AI (Bakir & McStay, 2020; McStay, 2020). It was also considered important to include the views of those who have traditionally been ignored in the development of technology, or who have suffered further discrimination through its use, and so the opinions and perspectives of minority groups and disabled people were sought. Participants were recruited through a research panel for the workshops, which took place in August 2021.

A novel narrative approach was used, with participants taken through a piece of interactive fiction (developed using Twine, viewable here: https://eaitwine.neocities.org/), a day-in-the-life story of a protagonist encountering seven mundane use cases of emotional AI, each structured as: a) a neutral introduction to the technology; b) a binary choice involving the use of the technology; c) a ContraVision component demonstrating positive and negative events/outcomes. The use cases were:
• Home-hub smart assistant
• Bus station surveillance sensor
• Social media fake news/disinformation and profiling
• Spotify music recommendations (using voice and ambient data)
• Sales call evaluation and prompt tool
• Emotoy that collects and responds to children's emotional data
• Hire car in-cabin customisation and driving support

Each workshop lasted 2 hours. Audio files were transcribed using a transcription service before being corrected and formatted by a project researcher.

References:
Bakir, V., & McStay, A. (2020). Profiling & Targeting Emotions in Digital Political Campaigns. Briefing Paper for All Party Parliamentary Group on Electoral Campaigning Transparency.
McStay, A. (2020). Emotional AI, soft biometrics and the surveillance of emotional life: An unusual consensus on privacy. Big Data & Society, 7(1), 1–12. https://doi.org/10.1177/2053951720904386

CONTEXT
Emotional AI (EAI) technologies sense, learn and interact with citizens' emotions, moods, attention and intentions. Using weak and narrow rather than strong AI, machines read and react to emotion via text, images, voice, computer vision and biometric sensing. Concurrently, life in cities is increasingly technologically mediated. Data-driven sensors, actuators, robots and pervasive networking are changing how citizens experience cities, but not always for the better. Citizen needs and perspectives are often ancillary in emerging smart city deployments, resulting in mistrust in new civic infrastructure and its management (e.g. Alphabet's Sidewalk Labs). We need to avoid these issues repeating as EAI is rolled out in cities. Reading the body is an increasingly prevalent concern, as recent pushback against facial detection and recognition technologies demonstrates.
EAI is an extension of this, and as it becomes normalised across the next decade we are concerned about how these systems are governed, their social impacts on citizens, and how EAI can be designed in a more ethical manner. In both Japan and the UK, we are at a critical juncture where these social, technological and governance structures can be appropriately prepared before mass adoption of EAI, to enable citizens, in all their diversity, to live ethically and well with EAI in cities-as-platforms. Building on our ESRC/AHRC seminars in Tokyo (2019) that considered cross-cultural ethics and EAI, our research will enable a multi-stakeholder (commerce, security, media) and citizen-led interdisciplinary response to EAI for Japan and the UK. While these are two of the most advanced nations in regard to AI, the social contexts and histories from which these technologies emerge differ, providing rich scope for reflection and mutual learning.

AIMS/OBJECTIVES
1. To assess what it means to live ethically and well with EAI in cities in cross-cultural (UK-Japan) commercial, security and media contexts.
2. To map and engage with the ecology of influential actors developing and working with EAI in UK-Japan.
3. To understand commercial activities, intentions and ethical implications regarding EAI in cities, via interviews with industry, case studies, and analysis of patents.
4. To ascertain how EAI might impact security/policing stakeholders, and organisations in the new media ecology, via interviews with these stakeholders and case studies in UK-Japan.
5. To examine governance approaches for the collection and use of intimate data about emotions in public spaces, to understand how these guide EAI technological developments, and to build a repository of best practice on EAI in cities.
6. To understand diverse citizens' attitudes to EAI via quantitative national surveys and qualitative workshops, to co-design citizen-led, creative visions of what it means to live ethically and well with EAI in cities in UK-Japan.
8. To feed our insights to stakeholders shaping usage of EAI in cities in UK-Japan.
9. To advance surveillance studies, new media studies, information technology law, science & technology studies, security & policing studies, computer ethics and affective computing via: 24 international conference papers; a conference on EAI; 12 international, refereed journal papers; and a Special Issue on EAI.

APPLICATIONS/BENEFITS
We will:
- Raise awareness among UK and Japanese stakeholders (technology industry, policymakers, NGOs, security services, urban planners, media outlets, citizens) of how to live ethically and well with EAI in cities, via co-designed, citizen-led, qualitative visions fed into Stakeholder Policy Workshops; a Final Report with clear criteria on ethical usage of EAI in cities; 24 talks with stakeholders; and multiple news stories.
- Set up a think tank to provide impartial ethical advice on EAI and cross-cultural issues to diverse stakeholders during and after the project.
- Advance collaboration between UK and Japanese academics, disciplines and stakeholders in EAI.

An innovative narrative approach to collecting rich qualitative data from participants in an online setting was employed. A multimodal narrative was created using Twine, an interactive fiction writing tool. The narrative was developed drawing on ideas and concepts from Design Fiction, chiefly the use of diegetic prototypes - designed objects or technologies that exist within a fictional world - and incorporates elements of ContraVision, where positive and negative outcomes of the same scenario are shared with participants. The narrative was presented to participants via Zoom (online video conferencing software) during an online workshop, with discussion invited at different points.

10 sets of participants (n=46) were recruited for the following groups:
• 3 older (65+) groups: n=13
• 3 younger (18-34) groups: n=12
• 2 groups of people self-identifying as disabled: n=10
• 2 groups of members of UK ethnic minorities: n=11

There was an attempt to balance other demographic categories where possible. Participants were grouped by age, as this has been shown to be the biggest indicator of differences in attitude towards emotional AI. It was also considered important to include the views of those who have traditionally been ignored in the development of technology, or who have suffered further discrimination through its use, and so the opinions and perspectives of minority groups and disabled people were sought. Participants were recruited through a research panel.

Identifier
DOI https://doi.org/10.5255/UKDA-SN-855688
Metadata Access https://datacatalogue.cessda.eu/oai-pmh/v0/oai?verb=GetRecord&metadataPrefix=oai_ddi25&identifier=5906697b9d0f7866d8702db6da19232a6e0193b28aa9e71f3f076253fa1c0c42
Provenance
Creator Laffer, A, Bangor University
Publisher UK Data Service
Publication Year 2022
Funding Reference Economic and Social Research Council
Rights Andrew McStay, Bangor University. Vian Bakir, Bangor University. Alexander Laffer, Bangor University. Lachlan Urquhart, University of Edinburgh. Diana Miranda, University of Stirling; The Data Collection is available for download to users registered with the UK Data Service.
OpenAccess true
Representation
Resource Type Text
Discipline Jurisprudence; Law; Social and Behavioural Sciences
Spatial Coverage United Kingdom