Column: 机器学习研究会 (Machine Learning Research Society)
The Machine Learning Research Society is a student organization under the Peking University Innovation Center for Big Data and Machine Learning, aiming to build a platform where machine learning practitioners can exchange ideas. Besides sharing timely news from the field, the society also hosts talks by industry and academic leaders, salon-style sharing sessions with prominent researchers, real-data innovation competitions, and other events.

[Learning] What the Experts Say: A Gentle Introduction to LSTMs

机器学习研究会 · WeChat Official Account · AI · 2017-05-25 21:07

Main text



Summary
 

Reposted from: 爱可可-爱生活

Long Short-Term Memory (LSTM) networks are a type of recurrent neural network capable of learning order dependence in sequence prediction problems.


This is a behavior required in complex problem domains like machine translation, speech recognition, and more.
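To make that behavior concrete, here is a minimal sketch (not part of the original article) of an LSTM applied to a toy sequence-prediction task in Keras: predicting the next value of a sine wave from a window of previous values. The window length, layer size, and all other hyperparameters are illustrative assumptions.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

# Toy data: sliding windows over a sine wave; the model must use the order
# of the previous `window` values to predict the next one.
series = np.sin(np.linspace(0, 20 * np.pi, 2000))
window = 10
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape: (samples, timesteps, features)

# A single LSTM layer summarizes the ordered input; a Dense head predicts
# the next value in the sequence.
model = Sequential([
    Input(shape=(window, 1)),
    LSTM(32),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)
```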


LSTMs are a complex area of deep learning. It can be hard to get your head around what LSTMs are, and how terms like bidirectional and sequence-to-sequence relate to the field.
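For readers who want a concrete anchor for those terms, the following sketch (an illustrative assumption, not taken from the article) shows how they map onto common Keras building blocks; the layer sizes are arbitrary.

```python
from tensorflow.keras.layers import LSTM, Bidirectional

# Bidirectional: the wrapper runs one LSTM forward and one backward over the
# input sequence and concatenates their outputs.
bi_lstm = Bidirectional(LSTM(32))

# Sequence-to-sequence (encoder-decoder): the encoder compresses the input
# sequence into its final hidden and cell states, which initialize a decoder
# that then emits the output sequence step by step.
encoder = LSTM(32, return_state=True)
decoder = LSTM(32, return_sequences=True)
```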


In this post, you will get insight into LSTMs in the words of the research scientists who developed the methods and applied them to new and important problems.


Few are better at clearly and precisely articulating both the promise of LSTMs and how they work than the experts who developed them.


We will explore key questions in the field of LSTMs using quotes from the experts, and if you’re interested, you will be able to dive into the original papers from which the quotes were taken.


Link:

http://machinelearningmastery.com/gentle-introduction-long-short-term-memory-networks-experts/


Original post:

http://weibo.com/1402400261/F4GFtEBOP?from=page_1005051402400261_profile&wvr=6&mod=weibotime&type=comment
