Article ID: 6873104
Journal: Future Generation Computer Systems
Published Year: 2018
Pages: 10 Pages
File Type: PDF
Abstract
In this work, we propose T1000, which re-fuses the convolutions across tensors after applying canonical polyadic decomposition to conventional convolution layers, so that we retain the reduced computational complexity while mitigating the memory occupancy of the intermediate data. We demonstrate the effectiveness of our approach by applying canonical polyadic decomposition and re-fusion to the convolution layers of two well-known convolutional neural networks, AlexNet and VGG-19, implemented in Caffe. Compared to the default canonical polyadic decomposition, our approach reduces the memory occupancy of the intermediate data by 84.6% for AlexNet and 77.4% for VGG-19. In addition, it improves the performance of AlexNet and VGG-19 by 1.77× and 1.40× respectively.
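As background for the abstract, the sketch below illustrates the standard way a canonical polyadic (CP) decomposition turns one convolution layer into a pipeline of four cheaper convolutions: a 1×1 input-channel projection, a d×1 vertical convolution, a 1×d horizontal convolution, and a 1×1 output-channel projection. The factor names (`A`, `B`, `C`, `D`) and functions are illustrative assumptions, not the paper's implementation; in practice the factors come from fitting a CP decomposition (e.g. CP-ALS) to a trained kernel, and the intermediate tensors `U`, `V`, `Z` materialized between stages are exactly the memory overhead the paper's re-fusion targets.

```python
import numpy as np

rng = np.random.default_rng(0)
S, T, d, R = 4, 6, 3, 5        # input channels, output channels, kernel size, CP rank
H = W = 8                      # spatial size of the input feature map

# Hypothetical CP factors (in practice fitted to a trained kernel, e.g. via CP-ALS)
A = rng.standard_normal((T, R))   # output-channel factor
B = rng.standard_normal((S, R))   # input-channel factor
C = rng.standard_normal((d, R))   # vertical spatial factor
D = rng.standard_normal((d, R))   # horizontal spatial factor

# Rank-R reconstruction: K[t, s, i, j] = sum_r A[t,r] B[s,r] C[i,r] D[j,r]
K = np.einsum('tr,sr,ir,jr->tsij', A, B, C, D)

X = rng.standard_normal((S, H, W))

def conv_valid(X, K):
    """Direct multi-channel 'valid' cross-correlation with the full 4D kernel."""
    T, S, d, _ = K.shape
    Ho, Wo = X.shape[1] - d + 1, X.shape[2] - d + 1
    Y = np.zeros((T, Ho, Wo))
    for i in range(Ho):
        for j in range(Wo):
            Y[:, i, j] = np.einsum('tsij,sij->t', K, X[:, i:i+d, j:j+d])
    return Y

def conv_cp(X, A, B, C, D):
    """The same convolution as four cheap stages (the CP pipeline).

    U, V, Z are the intermediate tensors that a default (unfused)
    CP implementation writes out between stages.
    """
    U = np.einsum('sr,shw->rhw', B, X)                                         # 1x1 projection
    Ho = U.shape[1] - C.shape[0] + 1
    V = sum(C[i][:, None, None] * U[:, i:i+Ho, :] for i in range(C.shape[0]))  # d x 1 conv
    Wo = V.shape[2] - D.shape[0] + 1
    Z = sum(D[j][:, None, None] * V[:, :, j:j+Wo] for j in range(D.shape[0]))  # 1 x d conv
    return np.einsum('tr,rhw->thw', A, Z)                                      # 1x1 projection

# The four-stage pipeline matches convolution with the reconstructed kernel.
assert np.allclose(conv_valid(X, K), conv_cp(X, A, B, C, D))
```

The per-rank factorization is what cuts the arithmetic cost (each stage mixes only one axis at a time), while the intermediates `U`, `V`, and `Z` are what the abstract's re-fusion avoids keeping fully materialized.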
Related Topics
Physical Sciences and Engineering Computer Science Computational Theory and Mathematics
Authors