Addressing indirect discrimination and gender stereotypes in AI virtual personal assistants: the role of international human rights law

dc.date.accessioned 2022-08-17T13:18:40Z
dc.date.available 2022-08-17T13:18:40Z
dc.date.issued 2020-01-07 en
dc.identifier.uri http://hdl.handle.net/20.500.11910/15077
dc.description.abstract Virtual personal assistants (VPAs) are increasingly becoming a common aspect of everyday living. However, with female names, voices and characters, these devices appear to reproduce harmful gender stereotypes about the role of women in society and the type of work women perform. Designed to assist, VPAs such as Apple's Siri and Amazon's Alexa reproduce and reify the idea that women are subordinate to men and exist to be used by them. Despite their ubiquity, these aspects of their design have received little critical attention in scholarship, and the potential legal responses to this issue have yet to be fully canvassed. Accordingly, this article critiques the reproduction of negative gender stereotypes in VPAs and explores the provisions and findings of international women's rights law to assess both how this constitutes indirect discrimination and the possible means for redress. To this end, the article examines the obligation to protect women from discrimination at the hands of private actors under the Convention on the Elimination of All Forms of Discrimination Against Women, and the work of the Committee on the Elimination of Discrimination against Women on gender stereotyping. With regard to corporate human rights responsibilities, the role of the United Nations Guiding Principles on Business and Human Rights is examined, as well as domestic enforcement mechanisms for international human rights norms and standards, noting the limitations to date in enforcing human rights compliance by multinational private actors. en
dc.format.medium Print en
dc.subject WOMEN'S RIGHTS en
dc.subject INDIRECT DISCRIMINATION en
dc.subject INTERNATIONAL HUMAN RIGHTS LAW en
dc.title Addressing indirect discrimination and gender stereotypes in AI virtual personal assistants: the role of international human rights law en
dc.type Journal Article en
dc.description.version N en
dc.ProjectNumber N/A en
dc.Volume 8(2) en
dc.BudgetYear 2019/20 en
dc.ResearchGroup Research Use and Impact Assessment en
dc.SourceTitle Cambridge International Law Journal en
dc.PlaceOfPublication Cambridge en
dc.ArchiveNumber 11121 en
dc.PageNumber 241-257 en
dc.outputnumber 10231 en
dc.bibliographictitle Adams, R. & Loideain, N. (2019) Addressing indirect discrimination and gender stereotypes in AI virtual personal assistants: the role of international human rights law. Cambridge International Law Journal. 8(2):241-257. http://hdl.handle.net/20.500.11910/15077 en
dc.publicationyear 2019 en
dc.contributor.author1 Adams, R. en
dc.contributor.author2 Loideain, N. en


