Show simple item record

dc.contributor.author    Widyawan, Widyawan
dc.contributor.author    Pirkl, Gerald
dc.contributor.author    Munaretto, Daniele
dc.contributor.author    Fischer, Carl
dc.contributor.author    An, Chunlei
dc.contributor.author    Lukowicz, Paul
dc.contributor.author    Klepal, Martin
dc.contributor.author    Timm-Giel, Andreas
dc.contributor.author    Widmer, Joerg
dc.contributor.author    Pesch, Dirk
dc.date.accessioned      2021-07-13T09:37:12Z
dc.date.available        2021-07-13T09:37:12Z
dc.date.issued           2012-06
dc.identifier.issn       1574-1192
dc.identifier.uri        http://hdl.handle.net/20.500.12761/676
dc.description.abstract    We present a novel, multimodal indoor navigation technique that combines pedestrian dead reckoning (PDR) with relative position information from wireless sensor nodes. It is motivated by emergency response scenarios where no fixed or pre-deployed global positioning infrastructure is available and where typical motion patterns defeat standard PDR systems. We use RF and ultrasound beacons to periodically re-align the PDR system and reduce the impact of incremental error accumulation. Unlike previous work on multimodal positioning, we allow the beacons to be dynamically deployed (dropped by the user) at previously unknown locations. A key contribution of this paper is to show that, even though the beacon locations are not known in absolute coordinates, they significantly improve the performance of the system. This effect is especially relevant when a user re-traces (parts of) the path he or she had previously traveled, or lingers and moves around in an irregular pattern at a single location for extended periods of time. Both situations are common and relevant for emergency response scenarios. We describe the system architecture and the fusion algorithms, and provide an in-depth evaluation in a large-scale, realistic experiment.
dc.language.iso          eng
dc.publisher             Elsevier
dc.subject.lcc           Q Science::Q Science (General)
dc.subject.lcc           Q Science::QA Mathematics::QA75 Electronic computers. Computer science
dc.subject.lcc           T Technology::T Technology (General)
dc.subject.lcc           T Technology::TA Engineering (General). Civil engineering (General)
dc.subject.lcc           T Technology::TK Electrical engineering. Electronics. Nuclear engineering
dc.title                 Virtual lifeline: Multimodal sensor data fusion for robust navigation in unknown environments    en
dc.type                  journal article
dc.journal.title         Pervasive and Mobile Computing
dc.type.hasVersion       VoR
dc.rights.accessRights   open access
dc.volume.number         8
dc.issue.number          3
dc.identifier.doi        http://dx.doi.org/10.1016/j.pmcj.2011.04.005
dc.page.final            401
dc.page.initial          388
dc.subject.keyword       Virtual lifeline
dc.subject.keyword       Navigation
dc.subject.keyword       Sensor data fusion
dc.subject.keyword       Unknown environments
dc.description.refereed  TRUE
dc.description.status    pub
dc.eprint.id             http://eprints.networks.imdea.org/id/eprint/195
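
The abstract above describes the fusion idea at a high level: a PDR track that drifts over time is re-aligned whenever the user re-encounters a beacon that was dropped earlier at an unknown location. Below is a minimal, hypothetical Python sketch of that loop-closure effect only; it is not the authors' fusion algorithm (the paper uses RF and ultrasound ranging and a proper fusion filter), and all class, method, and parameter names are illustrative assumptions.

# Minimal illustrative sketch (assumption, not the paper's algorithm): a 2-D
# dead-reckoning estimate that is blended back toward the position recorded
# when a dropped beacon was first heard, mimicking the re-alignment effect
# described in the abstract.
import math

class SimpleVirtualLifeline:
    def __init__(self, blend=0.5):
        self.x, self.y = 0.0, 0.0      # current PDR position estimate
        self.beacon_fixes = {}         # beacon_id -> (x, y) at first contact
        self.blend = blend             # strength of the correction on re-contact

    def pdr_step(self, step_length, heading_rad):
        # Integrate one pedestrian dead-reckoning step (errors accumulate here).
        self.x += step_length * math.cos(heading_rad)
        self.y += step_length * math.sin(heading_rad)

    def beacon_contact(self, beacon_id):
        # First contact: remember where we currently think we are (the beacon's
        # absolute coordinates stay unknown). Later contacts: blend the estimate
        # back toward that recorded fix.
        if beacon_id not in self.beacon_fixes:
            self.beacon_fixes[beacon_id] = (self.x, self.y)
        else:
            bx, by = self.beacon_fixes[beacon_id]
            self.x += self.blend * (bx - self.x)
            self.y += self.blend * (by - self.y)

# Example: drop a beacon, walk out, walk back with a heading error, and
# observe how re-hearing the beacon reduces the accumulated drift.
nav = SimpleVirtualLifeline()
nav.beacon_contact("node-1")                # beacon dropped at the start
for _ in range(50):
    nav.pdr_step(0.7, 0.0)                  # walk out along a corridor
for _ in range(50):
    nav.pdr_step(0.7, math.pi + 0.1)        # walk back with a heading error
print("before re-alignment:", round(nav.x, 2), round(nav.y, 2))
nav.beacon_contact("node-1")                # re-hearing the beacon pulls the track back
print("after re-alignment: ", round(nav.x, 2), round(nav.y, 2))

With blend=1.0 the estimate would snap fully onto the first-contact fix; a partial blend stands in for the weighting a real fusion filter would derive from the relative uncertainty of the PDR track and the beacon measurement.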

