In this paper, we propose a neural graph-based dependency parsing model that uses hierarchical LSTM networks at the character and word levels to learn word representations, allowing our model to avoid the limited-vocabulary problem and to capture both distributional and compositional semantic information. Our model achieves state-of-the-art accuracy on the Chinese Penn Treebank and competitive accuracy on the English Penn Treebank with only first-order features. Moreover, our model is effective at recovering dependencies involving out-of-vocabulary words.
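The key idea of the character-to-word hierarchy can be illustrated with a toy sketch (this is not the authors' implementation; the dimensions, initialization, and the example sentence are illustrative assumptions). A character-level LSTM composes a vector for each word from its character embeddings, so even an out-of-vocabulary word receives a representation, and a word-level LSTM then runs over the composed word vectors:

```python
# Minimal hierarchical char->word LSTM sketch in pure Python.
# Illustrative only: toy dimensions, random init, no training.
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyLSTM:
    """Single-layer LSTM cell applied over a sequence of input vectors."""
    def __init__(self, in_dim, hid_dim):
        self.in_dim, self.hid_dim = in_dim, hid_dim
        def mat(rows, cols):
            return [[random.uniform(-0.1, 0.1) for _ in range(cols)]
                    for _ in range(rows)]
        # One weight matrix and bias per gate: input, forget, output, candidate.
        self.W = {g: mat(hid_dim, in_dim + hid_dim) for g in "ifoc"}
        self.b = {g: [0.0] * hid_dim for g in "ifoc"}

    def step(self, x, h, c):
        z = x + h  # concatenate input vector and previous hidden state
        gates = {}
        for g in "ifoc":
            pre = [sum(w * v for w, v in zip(row, z)) + b
                   for row, b in zip(self.W[g], self.b[g])]
            gates[g] = [math.tanh(p) if g == "c" else sigmoid(p) for p in pre]
        c_new = [f * cc + i * cand
                 for f, cc, i, cand in zip(gates["f"], c, gates["i"], gates["c"])]
        h_new = [o * math.tanh(cv) for o, cv in zip(gates["o"], c_new)]
        return h_new, c_new

    def encode(self, xs):
        h, c = [0.0] * self.hid_dim, [0.0] * self.hid_dim
        for x in xs:
            h, c = self.step(x, h, c)
        return h  # final hidden state summarizes the sequence

CHAR_DIM, WORD_DIM = 4, 6
char_emb = {}  # hypothetical character embedding table, grown on demand

def embed_char(ch):
    if ch not in char_emb:
        char_emb[ch] = [random.uniform(-0.1, 0.1) for _ in range(CHAR_DIM)]
    return char_emb[ch]

char_lstm = TinyLSTM(CHAR_DIM, WORD_DIM)   # composes words from characters
word_lstm = TinyLSTM(WORD_DIM, WORD_DIM)   # runs over composed word vectors

def word_vector(word):
    # Any word, seen or unseen, maps to a vector via its characters.
    return char_lstm.encode([embed_char(ch) for ch in word])

sentence = ["the", "parser", "reads", "parserless", "text"]
context = word_lstm.encode([word_vector(w) for w in sentence])
print(len(context))  # hidden-state dimensionality of the word-level LSTM
```

Because `word_vector` is built from characters rather than a fixed lookup table, a made-up word like `"parserless"` is handled the same way as in-vocabulary words, which is the mechanism the abstract credits for recovering dependencies involving out-of-vocabulary words.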