A collection of resources (papers + code) on deep learning for text summarization, with brief commentary on the summarization task, evaluation metrics, and related work.
1. A Neural Attention Model for Abstractive Sentence Summarization (NAMAS) [Rush et al., EMNLP'15].
2. Sequence-to-Sequence with Attention Model for Text Summarization (textsum) [Google Brain, GitHub 2016].
3. Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond [Nallapati et al., CoNLL'16], which borrows the pointer idea from two earlier papers: Addressing the Rare Word Problem in Neural Machine Translation [Luong et al., ACL'15] and Pointer Networks [Vinyals et al., NIPS'15] (see the sketch after this list).
4. Neural Summarization by Extracting Sentences and Words [Cheng & Lapata, ACL'16].
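
The papers above all build on attention-based sequence-to-sequence decoding, and item 3 additionally uses a pointer/copy mechanism for rare source words. Below is a minimal NumPy sketch of those two ideas for illustration only: additive attention over encoder states at one decoder step, and a copy distribution that reuses the attention weights. The shapes, weight names, and the fixed p_gen mixing are my assumptions, not code from NAMAS, textsum, or Nallapati et al.'s model.

# Illustrative sketch only; assumed shapes/names, not any paper's exact formulation.
import numpy as np

def softmax(x):
    x = x - np.max(x)
    e = np.exp(x)
    return e / e.sum()

def attention_step(enc_states, dec_state, W_enc, W_dec, v):
    """Additive (Bahdanau-style) attention for one decoder step.

    enc_states: (T, H) encoder hidden states for T source tokens
    dec_state:  (H,)   current decoder hidden state
    Returns attention weights (T,) and the context vector (H,).
    """
    scores = np.tanh(enc_states @ W_enc + dec_state @ W_dec) @ v  # (T,)
    attn = softmax(scores)                                        # (T,)
    context = attn @ enc_states                                   # (H,)
    return attn, context

def copy_augmented_distribution(p_vocab, attn, src_ids, vocab_size, p_gen):
    """Mix a generation distribution with a pointer-style copy distribution.

    p_vocab: (V,) probabilities over the output vocabulary
    attn:    (T,) attention weights over source positions
    src_ids: (T,) vocabulary ids of the source tokens
    p_gen:   scalar in [0, 1], probability of generating vs. copying
    """
    p_copy = np.zeros(vocab_size)
    np.add.at(p_copy, src_ids, attn)  # scatter attention mass onto source words
    return p_gen * p_vocab + (1.0 - p_gen) * p_copy

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, H, V = 6, 8, 20
    enc_states = rng.normal(size=(T, H))
    dec_state = rng.normal(size=H)
    W_enc, W_dec, v = rng.normal(size=(H, H)), rng.normal(size=(H, H)), rng.normal(size=H)

    attn, context = attention_step(enc_states, dec_state, W_enc, W_dec, v)
    p_vocab = softmax(rng.normal(size=V))
    src_ids = rng.integers(0, V, size=T)
    p_final = copy_augmented_distribution(p_vocab, attn, src_ids, V, p_gen=0.8)
    print("attention weights:", np.round(attn, 3))
    print("final distribution sums to:", round(float(p_final.sum()), 6))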
Link:
https://github.com/lipiji/App-DL
Original post:
http://weibo.com/2536116592/EC0D0oOEE?type=comment