Artificial intelligence has a gender-bias problem - just ask Siri

dc.date.accessioned 2020-04-17 en
dc.date.accessioned 2022-08-17T13:05:17Z
dc.date.available 2022-08-17T13:05:17Z
dc.date.issued 2020-04-19 en
dc.identifier.uri http://hdl.handle.net/20.500.11910/15262
dc.description.abstract All the virtual personal assistants on the market today come with a default female voice and are programmed to respond to all kinds of suggestive questions. Does their design as stereotyped females suggest that, in the midst of a global technological revolution, women remain trapped in the traditional roles and personalities of the past? en
dc.format.medium Print en
dc.publisher HSRC Press en
dc.subject ARTIFICIAL INTELLIGENCE (AI) en
dc.subject VIRTUAL PERSONAL ASSISTANT en
dc.subject SIRI en
dc.subject GENDER EQUALITY en
dc.title Artificial intelligence has a gender-bias problem - just ask Siri en
dc.type Journal Article en
dc.description.version N en
dc.ProjectNumber N/A en
dc.Volume 18(1) en
dc.BudgetYear 2019/20 en
dc.ResearchGroup Research Use and Impact Assessment en
dc.SourceTitle HSRC Review en
dc.PlaceOfPublication Cape Town en
dc.ArchiveNumber 11311 en
dc.PageNumber 14-15 en
dc.outputnumber 10439 en
dc.bibliographictitle Adams, R. (2020) Artificial intelligence has a gender-bias problem - just ask Siri. HSRC Review. 18(1):14-15. http://hdl.handle.net/20.500.11910/15262 en
dc.publicationyear 2020 en
dc.contributor.author1 Adams, R. en
