Contribution Details
Type | Book Chapter |
Scope | Contributions to practice |
Title | Female by Default? – Exploring the Effect of Voice Assistant Gender and Pitch on Trait and Trust Attribution |
Organization Unit |
Authors |
Editors |
Item Subtype | Original Work |
Refereed | Yes |
Status | Published in final form |
Language |
Booktitle | CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems |
ISBN | 978-1-4503-8095-9 |
Place of Publication | New York, NY, USA |
Publisher | ACM |
Page Range | Article 455 |
Date | 2021 |
Abstract Text | Gendered voice based on pitch is a prevalent design element in many contemporary Voice Assistants (VAs) but has been shown to strengthen harmful stereotypes. Interestingly, there is a dearth of research that systematically analyzes user perceptions of different voice genders in VAs. This study investigates gender stereotyping across two different tasks by analyzing the influence of pitch (low, high) and gender (women, men) on stereotypical trait ascription and trust formation in an exploratory online experiment with 234 participants. Additionally, we deploy a gender-ambiguous voice to compare against gendered voices. Our findings indicate that implicit stereotyping occurs for VAs. Moreover, we show that there are no significant differences in trust formed towards a gender-ambiguous voice versus gendered voices, which highlights their potential for commercial usage. |
Related URLs |
Digital Object Identifier | 10.1145/3411763.3451623 |
Other Identification Number | merlin-id:21086 |