Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6940519 | Pattern Recognition Letters | 2018 | 9 Pages |
Abstract
Deep learning has been widely applied to dependency parsing in recent years. In this paper, we propose an effective deep neural network model for graph-based dependency parsing. In our model, a feature-extraction layer is first designed by combining a bidirectional Long Short-Term Memory network (BLSTM) with a segment-based Convolutional Neural Network (SCNN), which captures rich contextual information about the sentence for parsing. The features learned in this layer are then fed into a standard feed-forward network, which is trained with a max-margin criterion and predicts dependency labels. Finally, a classical dynamic programming algorithm searches the dependency graph for the best dependency structure of the sentence. In our experiments, we test the proposed model on 14 languages, including English, Chinese, and German; the results show that it achieves competitive unlabeled and labeled attachment scores compared with state-of-the-art dependency parsers. Moreover, the model recovers long-distance dependencies better than common neural network models.
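For projective trees, the classical dynamic programming decoder in graph-based parsing is typically Eisner's algorithm. The sketch below is a minimal pure-Python illustration (not the paper's implementation) that recovers the highest-scoring projective tree from a hypothetical arc-score matrix, such as one produced by the scoring network described in the abstract.

```python
def eisner(scores):
    """Highest-scoring projective dependency tree (Eisner's algorithm).

    scores[h][m]: score of an arc from head h to modifier m; index 0 is
    the artificial ROOT. Returns the head index for each real word.
    """
    n = len(scores)  # number of tokens including ROOT
    # comp/incomp[s][t][d]: best score of a complete/incomplete span (s, t);
    # d=1 means the head is at s (left), d=0 means the head is at t (right).
    comp = [[[0.0, 0.0] for _ in range(n)] for _ in range(n)]
    incomp = [[[0.0, 0.0] for _ in range(n)] for _ in range(n)]
    comp_bp = [[[0, 0] for _ in range(n)] for _ in range(n)]
    incomp_bp = [[[0, 0] for _ in range(n)] for _ in range(n)]

    for k in range(1, n):
        for s in range(n - k):
            t = s + k
            # Incomplete spans: add the arc s -> t (d=1) or t -> s (d=0).
            r = max(range(s, t), key=lambda r: comp[s][r][1] + comp[r + 1][t][0])
            val = comp[s][r][1] + comp[r + 1][t][0]
            incomp[s][t][0] = val + scores[t][s]
            incomp[s][t][1] = val + scores[s][t]
            incomp_bp[s][t][0] = incomp_bp[s][t][1] = r
            # Complete spans: merge an incomplete item with a complete one.
            r0 = max(range(s, t), key=lambda r: comp[s][r][0] + incomp[r][t][0])
            comp[s][t][0] = comp[s][r0][0] + incomp[r0][t][0]
            comp_bp[s][t][0] = r0
            r1 = max(range(s + 1, t + 1),
                     key=lambda r: incomp[s][r][1] + comp[r][t][1])
            comp[s][t][1] = incomp[s][r1][1] + comp[r1][t][1]
            comp_bp[s][t][1] = r1

    heads = [0] * n

    def backtrack(s, t, d, complete):
        if s == t:
            return
        if complete:
            r = comp_bp[s][t][d]
            if d == 1:
                backtrack(s, r, 1, False)
                backtrack(r, t, 1, True)
            else:
                backtrack(s, r, 0, True)
                backtrack(r, t, 0, False)
        else:
            r = incomp_bp[s][t][d]
            heads[t if d == 1 else s] = s if d == 1 else t
            backtrack(s, r, 1, True)
            backtrack(r + 1, t, 0, True)

    backtrack(0, n - 1, 1, True)  # ROOT heads the whole sentence
    return heads[1:]
```

For example, with `scores = [[0, 10, 1], [0, 0, 10], [0, 8, 0]]` (ROOT plus two words), `eisner(scores)` returns `[0, 1]`: ROOT heads word 1, which heads word 2. The algorithm runs in O(n³) time, versus O(n⁵) for naive span-based parsing, because each span stores only its head's position, not the full subtree.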
Related Topics
Physical Sciences and Engineering
Computer Science
Computer Vision and Pattern Recognition
Authors
Si Nianwen, Wang Hengjun, Shan Yidong