
Roundup | The Most Worth-Reading NLP Papers of 2016

机器之心 · WeChat Official Account · AI · 2017-01-03 13:30


Introduction


Based on readers' votes and suggestions, PaperWeekly has selected the 15 most worth-reading NLP papers of 2016. They are listed in chronological order and cover several of the year's most active research directions.


1、Learning to Compose Neural Networks for Question Answering

Authors

Jacob Andreas, Marcus Rohrbach, Trevor Darrell, Dan Klein

Affiliation

Department of Electrical Engineering and Computer Sciences
University of California, Berkeley

Keywords

Question Answering


2、Text Understanding with the Attention Sum Reader Network

Authors

Rudolf Kadlec, Martin Schmid, Ondrej Bajgar, Jan Kleindienst

Affiliation

IBM Watson

Keywords

Machine Reading Comprehension


3、Improving Information Extraction by Acquiring External Evidence with Reinforcement Learning

Authors

Karthik Narasimhan, Adam Yala, Regina Barzilay

Affiliation

CSAIL, MIT

Keywords

Information Extraction; Reinforcement Learning


4、Pointing the Unknown Words

Authors

Caglar Gulcehre, Sungjin Ahn, Ramesh Nallapati, Bowen Zhou, Yoshua Bengio

Affiliation

Université de Montréal
IBM T.J. Watson Research
CIFAR Senior Fellow

Keywords

Unknown Words


5、Sequence-to-Sequence Learning as Beam-Search Optimization

Authors

Sam Wiseman, Alexander M. Rush

Affiliation

School of Engineering and Applied Sciences, Harvard University

Keywords

Seq2Seq; Beam Search


6、SQuAD: 100,000+ Questions for Machine Comprehension of Text

Authors

Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang

Affiliation

Computer Science Department
Stanford University

Keywords

Machine Reading Comprehension; Dataset


7、End-to-End Reinforcement Learning of Dialogue Agents for Information Access

Authors

Bhuwan Dhingra, Lihong Li, Xiujun Li, Jianfeng Gao, Yun-Nung Chen, Faisal Ahmed, Li Deng

Affiliation

School of Computer Science, Carnegie Mellon University
Microsoft Research
National Taiwan University

Keywords

Reinforcement Learning; Dialogue System


8、ReasoNet: Learning to Stop Reading in Machine Comprehension

Authors

Yelong Shen, Po-Sen Huang, Jianfeng Gao, Weizhu Chen

Affiliation

Microsoft Research Redmond

Keywords

Machine Reading Comprehension


9、Personalizing a Dialogue System with Transfer Learning

Authors

Kaixiang Mo, Shuangyin Li, Yu Zhang, Jiajun Li, Qiang Yang

Affiliation

The Hong Kong University of Science and Technology

Keywords

Dialogue System; Transfer Learning


10、LightRNN: Memory and Computation-Efficient Recurrent Neural Network

Authors

Xiang Li, Tao Qin, Jian Yang, Tie-Yan Liu

Affiliation

Nanjing University of Science and Technology
Microsoft Research Asia

Keywords

New Recurrent Neural Network


11、Dual Learning for Machine Translation

Authors

Yingce Xia, Di He, Tao Qin, Liwei Wang, Nenghai Yu, Tie-Yan Liu, Wei-Ying Ma

Affiliation

University of Science and Technology of China
Key Laboratory of Machine Perception (MOE), School of EECS, Peking University
Microsoft Research

Keywords

Dual Learning; Neural Machine Translation


12、Neural Machine Translation with Reconstruction

Authors

Zhaopeng Tu, Yang Liu, Lifeng Shang, Xiaohua Liu, Hang Li

Affiliation

Noah’s Ark Lab, Huawei Technologies
Department of Computer Science and Technology, Tsinghua University

Keywords

Neural Machine Translation


13、Linguistically Regularized LSTMs for Sentiment Classification

Authors

Qiao Qian, Minlie Huang, Xiaoyan Zhu

Affiliation

State Key Lab. of Intelligent Technology and Systems, National Lab. for Information Science and Technology
Dept. of Computer Science and Technology, Tsinghua University

Keywords

Sentiment Classification; LSTM


14、Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation

Authors

Melvin Johnson, Mike Schuster, Quoc V. Le, Maxim Krikun, Yonghui Wu, Zhifeng Chen, Nikhil Thorat, Fernanda Viégas, Martin Wattenberg, Greg Corrado, Macduff Hughes, Jeffrey Dean

Affiliation

Google

Keywords

Multilingual Neural Machine Translation; Zero-Shot


15、Language Modeling with Gated Convolutional Networks

Authors

Yann N. Dauphin, Angela Fan, Michael Auli, David Grangier

Affiliation

Facebook AI Research

Keywords

Language Modeling; Gated CNN


If you think an excellent NLP paper is missing from this list, please leave a comment or head over here to add it or share your thoughts.



To download all of the papers, click here.



About PaperWeekly


PaperWeekly is an academic community for sharing knowledge and exchanging ideas, covering all areas of NLP. If you also read papers regularly, enjoy sharing what you learn, and would like to discuss and study together with others, come and join us.




WeChat Official Account: PaperWeekly

Weibo: PaperWeekly (http://weibo.com/u/2678093863)
WeChat discussion group: add zhangjun168305 on WeChat (please note "join the group discussion" or "contribute paper notes" in your request)