Deep learning approaches for fingerprint localization in low-power wide area networks

dc.contributor.advisor: Myburgh, Hermanus Carel
dc.contributor.coadvisor: De Freitas, Allan
dc.contributor.email: albert.lutakamale@tuks.co.za
dc.contributor.postgraduate: Lutakamale, Albert Selebea
dc.date.accessioned: 2025-02-13T15:09:56Z
dc.date.available: 2025-02-13T15:09:56Z
dc.date.created: 2025-05
dc.date.issued: 2024-10
dc.description: Thesis (PhD (Electronic Engineering))--University of Pretoria, 2024.
dc.description.abstract: In recent years, low-power wide area networks (LPWANs), particularly long-range wide area networks (LoRaWAN), have been increasingly adopted in large-scale Internet of Things (IoT) applications because they offer energy-efficient and cost-effective long-range wireless communication. The need to provide location-stamped communications to IoT applications, so that physical measurements from IoT devices can be meaningfully interpreted, has increased the demand to incorporate location estimation capabilities into LPWAN networks. Factors such as high power consumption, high implementation costs, and poor localization performance in urban canyons or environments with many obstructions render outdoor localization solutions based on standalone GPS technology unfit for deployment in large-scale IoT applications, where the emphasis is on energy efficiency and cost-effectiveness. Implementing localization methods in short-range wireless communication networks, such as Bluetooth and ZigBee, to estimate the locations of target nodes in large outdoor environments is also not economically feasible: their short range would require a dense deployment of wireless nodes, leading to high implementation costs. In LoRaWAN, one of the key LPWAN technologies operating in unlicensed frequency bands, fingerprint-based localization methods are known to be robust in challenging environments with multipath and non-line-of-sight propagation, making them more accurate than range-based methods. However, most currently available fingerprint-based localization methods in LoRaWAN networks rely on conventional ‘shallow’ machine learning models. While such models may yield satisfactory results under specific conditions, their complexity tends to increase with the size of the training dataset, ultimately resulting in a decline in localization accuracy.

In this thesis, driven by the goal of improving the performance and efficiency of fingerprint-based localization in LoRaWAN networks, two deep learning-based fingerprinting methods for estimating the locations of target nodes are proposed. The first is a branched convolutional neural network (CNN) localization method enhanced with squeeze-and-excitation (SE) blocks, referred to as the CNN-SE method. The second is a hybrid CNN-transformer fingerprint-based localization method, referred to as the CNN-transformer method. The main contribution of the first method is the joint use of CNNs, which are efficient at learning useful positional information from structured data, and SE blocks, which model channel-wise interdependencies to recalibrate feature responses. The novel contribution of the second method is the development of a hybrid CNN-transformer fingerprint-based localization model that leverages the strengths of both CNNs and transformers: the CNN captures features from the input data at the local level, while the attention mechanism of the transformer captures features at the global level.

Adopting a 0.7/0.15/0.15 split for the training, validation, and test sets, respectively, and using the entire LoRaWAN dataset with the powed data representation scheme, the CNN-SE method achieved mean and median localization errors of 291.51 m and 147.55 m, respectively, on the test set. Under the same experimental settings, the CNN-transformer method achieved mean and median localization errors of 288.1 m and 143.7 m, respectively. These results outperform the currently available state-of-the-art fingerprint-based localization methods in the literature evaluated on the same publicly available LoRaWAN dataset. An R² score of 0.93, obtained by both methods, further indicates how well the proposed regressors fit the data, enabling them to localize target nodes with satisfactory accuracy.
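As a brief illustration of the preprocessing mentioned in the abstract, the sketch below shows one common definition of the powed data representation from the fingerprinting literature, together with a 0.7/0.15/0.15 random split. This is not the thesis code: the function names are hypothetical, the -200 dBm out-of-range placeholder is an assumption, and the exponent β is taken to be the mathematical constant e, as is often done.

```python
import numpy as np

def powed_representation(rss_dbm, rss_min=-200.0, beta=np.e):
    """Map raw RSSI readings (dBm) to the 'powed' representation (illustrative).

    The positive-shifted RSSI is normalised to [0, 1] and raised to an exponent
    beta. Gateways that did not receive the packet are assumed to be stored as
    rss_min, so they map to exactly 0.
    """
    positive = np.clip(np.asarray(rss_dbm, dtype=float) - rss_min, 0.0, None)
    return (positive / (-rss_min)) ** beta

def split_070_015_015(features, targets, seed=0):
    """Illustrative 0.7/0.15/0.15 train/validation/test split by random shuffling."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(features))
    n_train = int(0.70 * len(idx))
    n_val = int(0.15 * len(idx))
    train, val, test = np.split(idx, [n_train, n_train + n_val])
    return ((features[train], targets[train]),
            (features[val], targets[val]),
            (features[test], targets[test]))
```

The second sketch outlines, in PyTorch, the two architectural ideas the abstract names: a squeeze-and-excitation block that reweights the channels of a 1-D feature map, and a hybrid model in which a small CNN extracts local features from the per-gateway fingerprint before a transformer encoder attends over it globally and a linear head regresses 2-D coordinates. Layer sizes, class names, the gateway count, and the learnable positional embedding are illustrative assumptions, not the architectures evaluated in the thesis.

```python
import torch
import torch.nn as nn

class SEBlock1d(nn.Module):
    """Squeeze-and-excitation over the channels of a 1-D feature map (illustrative)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):                      # x: (batch, channels, length)
        w = self.fc(x.mean(dim=-1))            # squeeze: global average over length
        return x * w.unsqueeze(-1)             # excite: reweight each channel

class CNNTransformerRegressor(nn.Module):
    """Hybrid CNN-transformer fingerprint regressor (sketch, not the thesis model)."""
    def __init__(self, n_gateways, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.cnn = nn.Sequential(              # local feature extraction over gateways
            nn.Conv1d(1, d_model, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.se = SEBlock1d(d_model)           # channel recalibration via SE
        self.pos = nn.Parameter(torch.zeros(1, n_gateways, d_model))  # gateway identity
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 2)      # regress 2-D coordinates

    def forward(self, x):                      # x: (batch, n_gateways) powed RSSI
        h = self.cnn(x.unsqueeze(1))           # -> (batch, d_model, n_gateways)
        h = self.se(h)
        h = self.encoder(h.transpose(1, 2) + self.pos)  # global attention over gateways
        return self.head(h.mean(dim=1))        # pool tokens and regress location

if __name__ == "__main__":
    model = CNNTransformerRegressor(n_gateways=68)   # 68 gateways is an illustrative count
    dummy = torch.rand(4, 68)                        # batch of 4 powed fingerprints
    print(model(dummy).shape)                        # torch.Size([4, 2])
```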
dc.description.availability: Unrestricted
dc.description.degree: PhD (Electronic Engineering)
dc.description.department: Electrical, Electronic and Computer Engineering
dc.description.faculty: Faculty of Engineering, Built Environment and Information Technology
dc.description.sdg: SDG-11: Sustainable cities and communities
dc.description.sponsorship: Ministry of Education, Science, and Technology of the United Republic of Tanzania under the MoEST-HEET Project fund
dc.identifier.citation: *
dc.identifier.doi: 10.25403/UPresearchdata.28409261
dc.identifier.other: A2025
dc.identifier.uri: http://hdl.handle.net/2263/100865
dc.language.iso: en
dc.publisher: University of Pretoria
dc.rights: © 2023 University of Pretoria. All rights reserved. The copyright in this work vests in the University of Pretoria. No part of this work may be reproduced or transmitted in any form or by any means, without the prior written permission of the University of Pretoria.
dc.subject: UCTD
dc.subject: Sustainable Development Goals (SDGs)
dc.subject: Convolutional neural networks
dc.subject: Deep learning
dc.subject: Fingerprint localization
dc.subject: Long-range wide area network
dc.subject: Low-power wide area network
dc.subject: Internet of Things (IoT)
dc.title: Deep learning approaches for fingerprint localization in low-power wide area networks
dc.type: Thesis

Files

Original bundle

Name: Lutakamale_Deep_2024.pdf
Size: 23.06 MB
Format: Adobe Portable Document Format
Description: Thesis

License bundle

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission
Description: