Are we manipulated by artificial voices when we buy products or services online? Researchers from the University of Luxembourg have investigated this question, measuring the impact of synthetic voices on decision making.
From advice to manipulation
The increase in transaction-oriented interactions with voice-based systems opens up the possibility of manipulating user decisions and, in turn, negatively affecting users’ finances. “We found that the type of computer-generated speech (synthetic speech) can affect participants’ choices, independently of the content of the presented options. Users are more likely to select options provided by a highly natural synthetic voice than by a more robotic-sounding one. Our results also indicate that the perceived engagement, ease of understanding, and domain fit of synthetic speech translate directly into its impact on participants’ behaviour in decision-making tasks. Interestingly, users seem to underestimate the extent to which voice can sway them towards certain choices, which can potentially make them vulnerable to manipulation,” explains Dr Mateusz Dubiel from the Department of Computer Science at the University of Luxembourg.
Interdisciplinary work
This work is the outcome of an interdisciplinary collaboration between researchers from the Faculty of Science, Technology and Medicine and the Faculty of Humanities, Education and Social Sciences. It combines insights from the fields of speech processing, cognitive psychology, and behavioural change to demonstrate that the type of synthetic speech can affect not only how it is perceived but also how users behave. Based on this finding, the researchers propose a set of design implications for voice-based conversational agents to make their development ethical and more accessible while also promoting user agency.
The paper was presented at the ACM Conference on Intelligent User Interfaces (IUI) 2024, which took place from 18 to 21 March 2024 in Greenville, USA. This annual event gathers researchers and practitioners to discuss state-of-the-art advances at the intersection of artificial intelligence and human-computer interaction.
This work was supported by the Horizon 2020 FET programme of the European Union through ERA-NET Cofund funding (BANANA, grant CHIST-ERA-20-BCI-001) and by Horizon Europe’s European Innovation Council through the Pathfinder programme (SYMBIOTIK, grant 101071147). It was also supported by the Luxembourg National Research Fund (FNR) through the Decepticon project (grant no. IS/14717072).
Publication: “Impact of Voice Fidelity on Decision Making: A Potential Dark Pattern?” by Mateusz Dubiel, Anastasia Sergeeva, and Luis A. Leiva.