Neural attention-based models have recently been widely used in headline generation, mapping a source document to a target headline. However, traditional neural headline generation models use only the first sentence of the document as training input, ignoring the impact of document concept information on headline generation. In this work, a new neural attention-based model, called the concept-sensitive neural headline model, is proposed; it connects the concept information of the document to the input text for headline generation and achieves satisfactory results. In addition, we use a multi-layer Bi-LSTM encoder instead of a single layer. Experiments show that our model outperforms state-of-the-art systems on the DUC-2004 and Gigaword test sets.
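The two ideas in the abstract can be illustrated with a minimal sketch: concept tokens are prepended to the input text before encoding, and the decoder attends over the encoder states. This is an assumption-laden toy (plain dot-product attention, hypothetical function names, no learned parameters), not the paper's actual architecture.

```python
import math

def concat_concept(concept_tokens, input_tokens):
    # Hypothetical helper: prepend document concept tokens to the input
    # sentence so the encoder sees concept information alongside the text.
    return concept_tokens + input_tokens

def attention_weights(decoder_state, encoder_states):
    # Dot-product attention scores followed by a softmax -- a simple
    # stand-in for the attention in neural headline-generation models.
    scores = [sum(d * e for d, e in zip(decoder_state, state))
              for state in encoder_states]
    m = max(scores)                       # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [x / total for x in exps]

# Toy usage: a concept marker plus a concept word precede the source text,
# and attention favors the encoder state most aligned with the decoder state.
src = concat_concept(["<concept>", "finance"], ["stocks", "rose", "today"])
w = attention_weights([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
```

In a real model the encoder states would come from the multi-layer Bi-LSTM mentioned above, and the attention would use learned projections rather than raw dot products.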