
dc.date.accessioned: 2022-04-19T16:41:36Z
dc.date.available: 2022-04-19T16:41:36Z
dc.date.created: 2021-10-21T18:14:54Z
dc.date.issued: 2021
dc.identifier.citation: Noori, Farzan Majeed; Uddin, Md Zia; Tørresen, Jim. Ultra-Wideband Radar-Based Activity Recognition Using Deep Learning. IEEE Access. 2021, 9, 138132-138143
dc.identifier.uri: http://hdl.handle.net/10852/93594
dc.description.abstract: With recent advances in the field of sensing, it has become possible to build better assistive technologies. This enables the strengthening of eldercare with regard to daily routines and the provision of personalised care to users. For instance, it is possible to detect a person’s behaviour based on wearable or ambient sensors; however, it is difficult for users to wear devices 24/7, as they would have to be recharged regularly because of their energy consumption. Similarly, although cameras have been widely used as ambient sensors, they carry the risk of breaching users’ privacy. This paper presents a novel sensing approach based on deep learning for human activity recognition using a non-wearable ultra-wideband (UWB) radar sensor. UWB sensors protect privacy better than RGB cameras because they do not collect visual data. In this study, UWB sensors were mounted on a mobile robot to monitor and observe subjects from a specific distance (namely, 1.5–2.0 m). Initially, data were collected in a lab environment for five different human activities. Subsequently, the data were used to train a model using the state-of-the-art deep learning approach, namely long short-term memory (LSTM). Conventional training approaches were also tested to validate the superiority of LSTM. As a UWB sensor collects many data points in a single frame, enhanced discriminant analysis was used to reduce the dimensions of the features through application of principal component analysis to the raw dataset, followed by linear discriminant analysis. The enhanced discriminant features were fed into the LSTMs. Finally, the trained model was tested using new inputs. The proposed LSTM-based activity recognition approach performed better than conventional approaches, with an accuracy of 99.6%. We applied 5-fold cross-validation to test our approach. We also validated our approach on a publicly available dataset. The proposed method can be applied in many prominent fields, including human–robot interaction for various practical applications, such as mobile robots for eldercare.
dc.language: EN
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: Ultra-Wideband Radar-Based Activity Recognition Using Deep Learning
dc.type: Journal article
dc.creator.author: Noori, Farzan Majeed
dc.creator.author: Uddin, Md Zia
dc.creator.author: Tørresen, Jim
cristin.unitcode: 185,15,5,46
cristin.unitname: Forskningsgruppe for robotikk og intelligente systemer (Research Group for Robotics and Intelligent Systems)
cristin.ispublished: true
cristin.fulltext: original
cristin.qualitycode: 1
dc.identifier.cristin: 1947653
dc.identifier.bibliographiccitation: info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=IEEE Access&rft.volume=9&rft.spage=138132&rft.date=2021
dc.identifier.jtitle: IEEE Access
dc.identifier.volume: 9
dc.identifier.startpage: 138132
dc.identifier.endpage: 138143
dc.identifier.doi: https://doi.org/10.1109/ACCESS.2021.3117667
dc.identifier.urn: URN:NBN:no-96158
dc.type.document: Tidsskriftartikkel (journal article)
dc.type.peerreviewed: Peer reviewed
dc.source.issn: 2169-3536
dc.identifier.fulltext: Fulltext https://www.duo.uio.no/bitstream/handle/10852/93594/1/Noori_2021_Ultra-wideband.pdf
dc.type.version: PublishedVersion
dc.relation.project: NFR/312333
dc.relation.project: NFR/247697
dc.relation.project: NFR/288285
dc.relation.project: NFR/262762
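
The abstract above outlines the recognition pipeline: raw UWB radar frames are reduced with principal component analysis followed by linear discriminant analysis (the "enhanced discriminant analysis" step), and the reduced feature sequences are fed to an LSTM classifier evaluated with 5-fold cross-validation. Below is a minimal sketch of that pipeline in Python using scikit-learn and Keras, not the authors' code: the sequence length, PCA dimensionality, LSTM width, and training settings are illustrative assumptions that the abstract does not specify.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, Dense

NUM_ACTIVITIES = 5               # five activities collected in the lab (per abstract)
SEQ_LEN = 30                     # assumed number of radar frames per activity sample
PCA_DIMS = 60                    # assumed PCA dimensionality (not stated in the abstract)
LDA_DIMS = NUM_ACTIVITIES - 1    # LDA yields at most n_classes - 1 components

def enhanced_discriminant_features(frames, frame_labels):
    """Fit PCA then LDA on training frames; return a function that reduces new frames."""
    pca = PCA(n_components=PCA_DIMS).fit(frames)
    lda = LinearDiscriminantAnalysis(n_components=LDA_DIMS).fit(
        pca.transform(frames), frame_labels)
    return lambda x: lda.transform(pca.transform(x))

def build_lstm(seq_len, feat_dim, num_classes):
    """Small LSTM classifier over sequences of reduced radar features (sizes assumed)."""
    model = Sequential([
        LSTM(64, input_shape=(seq_len, feat_dim)),
        Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def cross_validate(X, y, folds=5):
    """5-fold cross-validation over activity samples.

    X: (n_samples, SEQ_LEN, n_range_bins) raw radar frame sequences
    y: (n_samples,) integer activity labels in [0, NUM_ACTIVITIES)
    """
    scores = []
    for train_idx, test_idx in StratifiedKFold(folds, shuffle=True, random_state=0).split(X, y):
        # Fit PCA + LDA on training frames only, labelling each frame with its sequence label.
        train_frames = X[train_idx].reshape(-1, X.shape[-1])
        frame_labels = np.repeat(y[train_idx], SEQ_LEN)
        reduce_fn = enhanced_discriminant_features(train_frames, frame_labels)

        def to_sequences(batch):
            # Reduce every frame, then regroup frames into fixed-length sequences.
            flat = reduce_fn(batch.reshape(-1, batch.shape[-1]))
            return flat.reshape(len(batch), SEQ_LEN, LDA_DIMS)

        model = build_lstm(SEQ_LEN, LDA_DIMS, NUM_ACTIVITIES)
        model.fit(to_sequences(X[train_idx]), y[train_idx], epochs=30, verbose=0)
        _, acc = model.evaluate(to_sequences(X[test_idx]), y[test_idx], verbose=0)
        scores.append(acc)
    return float(np.mean(scores))

The sketch keeps the dimensionality-reduction step strictly inside each training fold so that test frames never influence the PCA or LDA fit, which is the usual way to avoid leakage in this kind of cross-validated evaluation.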

