Long Short-Term Memory (LSTM) networks are a type of recurrent neural
network capable of learning order dependence in sequence prediction
problems.
This is a behavior required in complex problem domains like machine translation, speech recognition, and more.
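To make the idea of "order dependence" concrete, here is a minimal sketch of a single LSTM cell's forward step in plain NumPy. The gate order, weight shapes, and function names are illustrative choices for this sketch, not taken from any particular paper; the structure (input, forget, and output gates plus a candidate cell update) is the standard LSTM formulation.

```python
import numpy as np

def sigmoid(a):
    """Logistic function, squashes values into (0, 1) for gating."""
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x, h_prev, c_prev, W, b):
    """One forward step of a single LSTM cell.

    x:      input vector, shape (n_in,)
    h_prev: previous hidden state, shape (n_hidden,)
    c_prev: previous cell state, shape (n_hidden,)
    W:      weights, shape (n_in + n_hidden, 4 * n_hidden)
    b:      bias, shape (4 * n_hidden,)
    Gate order in W (an arbitrary choice here): input, forget,
    output, candidate.
    """
    n_hidden = h_prev.shape[0]
    # All four gate pre-activations in one matrix multiply.
    z = np.concatenate([x, h_prev]) @ W + b
    i = sigmoid(z[:n_hidden])                # input gate
    f = sigmoid(z[n_hidden:2 * n_hidden])    # forget gate
    o = sigmoid(z[2 * n_hidden:3 * n_hidden])  # output gate
    g = np.tanh(z[3 * n_hidden:])            # candidate cell update
    c = f * c_prev + i * g                   # new cell state
    h = o * np.tanh(c)                       # new hidden state
    return h, c
```

Running this step repeatedly over a sequence, carrying `h` and `c` forward, is what lets the network retain information across time steps; the forget gate `f` decides how much of the old cell state survives each step.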
LSTMs are a complex area of deep learning, and it can be hard to wrap
your head around what LSTMs are and how terms like bidirectional and
sequence-to-sequence relate to the field.
In this post, you will get insight into LSTMs in the words of the
research scientists who developed the methods and applied them to new
and important problems.
Few are better at clearly and precisely articulating both the promise
of LSTMs and how they work than the experts who developed them.
We will explore key questions in the field of LSTMs using quotes from
the experts, and if you’re interested, you will be able to dive into
the original papers from which the quotes were taken.
Link:
http://machinelearningmastery.com/gentle-introduction-long-short-term-memory-networks-experts/