Show simple item record

dc.contributor.author: Bispham, Mary
dc.contributor.author: Zard, Clara
dc.contributor.author: Sattar, Suliman
dc.contributor.author: Ferrer-Aran, Xavier
dc.contributor.author: Suarez-Tangil, Guillermo
dc.contributor.author: Such, Jose
dc.date.accessioned: 2022-06-30T09:28:49Z
dc.date.available: 2022-06-30T09:28:49Z
dc.date.issued: 2022-07
dc.identifier.uri: http://hdl.handle.net/20.500.12761/1599
dc.description.abstract: In this paper we investigate the issue of sensitive information leakage to third-party voice applications in voice assistant ecosystems. We focus specifically on leakage of sensitive information via the conversational interface. We use a bespoke testing infrastructure to investigate leakage of sensitive information via the conversational interface of Google Actions and Alexa Skills. Our work augments prior work in this area to consider not only specific categories of personal data, but also other types of potentially sensitive information that may be disclosed in voice-based interactions with third-party voice applications. Our findings indicate that current privacy and security measures for third-party voice applications are not sufficient to prevent leakage of all types of sensitive information via the conversational interface. We make key recommendations for the redesign of voice assistant architectures to better prevent leakage of sensitive information via the conversational interface of third-party voice applications in the future.
dc.description.sponsorship: RYC-2020-029401-I
dc.language.iso: eng
dc.title: Leakage of Sensitive Information to Third-Party Voice Applications
dc.type: conference object
dc.conference.date: 26–28 July 2022
dc.conference.place: Glasgow, UK
dc.conference.title: Conversational User Interfaces (CUI)
dc.event.type: conference
dc.pres.type: paper
dc.type.hasVersion: AO
dc.rights.accessRights: open access
dc.description.refereed: TRUE
dc.description.status: pub

