Communication Efficient Decentralized Collaborative Learning Of Heterogeneous Deep Learning Models Using Distillation And Incentive-Based Techniques

Iqbal, Zahid (2023) Communication Efficient Decentralized Collaborative Learning Of Heterogeneous Deep Learning Models Using Distillation And Incentive-Based Techniques. PhD thesis, Universiti Sains Malaysia.


Abstract

Smart devices collectively hold valuable, real-time data that could be used to train highly effective deep learning models for AI applications. However, because this data is sensitive, people are increasingly concerned about its privacy and unwilling to share it. There is therefore a need to learn from this valuable data in a decentralized fashion, keeping the data local to the devices and performing the necessary computation on the devices themselves by exploiting their computational resources. Statistical heterogeneity and full model heterogeneity are among the key challenges in applying Decentralized Learning (DL) approaches in real scenarios. Typically, existing DL techniques assume that all devices share a homogeneous model architecture. In real applications of DL, however, devices differ in computational resources and business needs, so it is natural for them to have completely different model architectures. Very limited work has addressed the full model heterogeneity problem. Similarly, some work has addressed statistical heterogeneity, but most of it is hard to apply in real scenarios or is restricted to limited use cases.
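The title indicates that knowledge distillation is the mechanism by which heterogeneous models collaborate without sharing raw data. As a rough illustration of that idea (not the thesis's actual method; the function names and temperature value below are illustrative assumptions), a device can align its local model with a peer by matching temperature-softened output probabilities, so only soft predictions, never training data, cross the network:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened probabilities; higher temperature flattens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence KL(teacher || student) over softened outputs.

    In a decentralized setting, only these soft predictions (or the logits
    behind them) would be exchanged between devices, which is what allows
    architecturally different models to learn from one another while the
    raw training data stays local.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

Because the loss depends only on the two models' output distributions over a shared label space, the student and teacher architectures are free to differ entirely, which is the property that makes distillation a natural fit for the fully model-heterogeneous setting the abstract describes.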

Item Type: Thesis (PhD)
Subjects: Q Science > QA Mathematics > QA75.5-76.95 Electronic computers. Computer science
Divisions: Pusat Pengajian Sains Komputer (School of Computer Sciences) > Thesis
Depositing User: Mr Mohammad Harish Sabri
Date Deposited: 07 Mar 2024 08:03
Last Modified: 07 Mar 2024 08:03
URI: http://eprints.usm.my/id/eprint/60077
