| Article ID | Journal | Published Year | Pages |
|---|---|---|---|
| 6939820 | Pattern Recognition | 2017 | 10 Pages |
Abstract
In this paper, multimodal deep learning for solar radio burst classification is proposed. We make the first attempt to build a multimodal learning network that learns the joint representation of solar radio spectra captured from different frequency channels, which are treated as different modalities. To learn the representation of each modality as well as the correlations and interactions between modalities, an autoencoder with structured regularization is used to enforce and learn the modality-specific sparsity and density of each modality. Fully connected layers are then employed to exploit the relationships between the modalities and generate the joint representation of the solar radio spectra. Solar radio burst classification is performed on this learned joint representation. Experiments on the constructed solar radio spectrum database demonstrate that the proposed multimodal learning network effectively learns the representation of the solar radio spectra and improves classification accuracy.
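The pipeline the abstract describes (per-modality encoding, fully connected fusion into a joint representation, then classification) can be sketched as follows. This is a minimal forward-pass illustration, not the authors' implementation: the layer sizes, the two-modality setup, the number of classes, and all names (`ModalityEncoder`, `classify`) are hypothetical, and the structured-regularization terms used during training are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class ModalityEncoder:
    """Encoder half of a per-modality autoencoder (hypothetical sizes;
    the sparsity/density regularization from the paper is not shown)."""
    def __init__(self, in_dim, hid_dim):
        self.W = rng.standard_normal((in_dim, hid_dim)) * 0.1
        self.b = np.zeros(hid_dim)

    def encode(self, x):
        return relu(x @ self.W + self.b)

# Two frequency channels of the spectrum treated as two modalities.
enc_a = ModalityEncoder(in_dim=128, hid_dim=64)
enc_b = ModalityEncoder(in_dim=128, hid_dim=64)

# Fully connected fusion layer producing the joint representation,
# followed by a softmax classifier (3 classes chosen for illustration).
W_joint = rng.standard_normal((128, 32)) * 0.1
W_cls = rng.standard_normal((32, 3)) * 0.1

def classify(x_a, x_b):
    # Concatenate modality-specific codes, fuse, then classify.
    h = np.concatenate([enc_a.encode(x_a), enc_b.encode(x_b)], axis=-1)
    joint = relu(h @ W_joint)  # joint representation of both modalities
    logits = joint @ W_cls
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# A batch of 4 spectra, 128 frequency bins per modality.
probs = classify(rng.standard_normal((4, 128)),
                 rng.standard_normal((4, 128)))
print(probs.shape)  # → (4, 3)
```

In the actual method, the encoders would be trained jointly with their decoders under the structured regularization before the fusion and classification layers are fit.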
Authors
Lin Ma, Zhuo Chen, Long Xu, Yihua Yan
