Article ID: 11030119
Journal: Future Generation Computer Systems
Published Year: 2019
Pages: 39
File Type: PDF
Abstract
This paper proposes a cloud-assisted CNN framework, named FitCNN, with incremental learning and low data transmission, to reduce the overhead of updating CNNs deployed on devices. To reduce data transmission during incremental learning, we propose a strategy, called Distiller, that selectively uploads only the data worth learning from, and develop an extracting strategy, called Juicer, that chooses a small portion of the weights from the new CNN model generated on the cloud to update the corresponding old weights on devices. Experimental results show that the Distiller strategy reduces the uploading data transmission by 39.4% on a certain dataset, and the Juicer strategy reduces the updating data transmission by more than 60% across multiple CNNs and datasets.
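The abstract does not specify the selection criteria used by Distiller or Juicer. As a minimal sketch of the general idea only, the Python snippet below assumes Distiller keeps samples the on-device CNN is least confident about, and Juicer transmits only the weights that changed most after incremental training on the cloud; both criteria and all function names are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def distiller_select(confidences, threshold=0.9):
    """Pick indices of samples worth uploading to the cloud.

    Hypothetical criterion: keep samples whose maximum softmax
    confidence falls below a threshold (i.e., samples the device
    model is unsure about). The paper's real rule is not given
    in the abstract.
    """
    confidences = np.asarray(confidences)
    return np.where(confidences < threshold)[0]

def juicer_delta(old_weights, new_weights, keep_ratio=0.4):
    """Build a sparse weight update to send from cloud to device.

    Hypothetical criterion: for each layer, keep only the fraction
    of weights whose values changed the most after incremental
    training, so the device downloads far less than the full model.
    """
    updates = {}
    for name, old in old_weights.items():
        new = new_weights[name]
        diff = np.abs(new - old).ravel()
        k = max(1, int(keep_ratio * diff.size))
        idx = np.argpartition(diff, -k)[-k:]      # indices of the k largest changes
        updates[name] = (idx, new.ravel()[idx])   # (positions, new values) payload
    return updates

# Example usage with toy weights
old = {"conv1": np.random.randn(8, 8)}
new = {"conv1": old["conv1"] + 0.1 * np.random.randn(8, 8)}
sparse_update = juicer_delta(old, new, keep_ratio=0.4)
```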
Related Topics
Physical Sciences and Engineering › Computer Science › Computational Theory and Mathematics
Authors