Column: 机器学习研究会 (Machine Learning Research Society)
The Machine Learning Research Society is a student organisation under Peking University's Innovation Center for Big Data and Machine Learning, dedicated to building a platform where machine learning practitioners can exchange ideas. Besides sharing timely news from the field, the society also hosts talks by industry leaders and top academics, salon-style sharing sessions with leading researchers, real-data innovation competitions, and other activities.

[Recommended] The latest deep learning for natural language processing course from Oxford and DeepMind (with lecture videos available for download via Baidu Netdisk)

机器学习研究会 · WeChat official account · AI · 2017-03-14 19:04

Abstract

Reposted from: 胜势待发就是爱学习爱锻炼爱生活

This is the latest deep learning for natural language processing course from Oxford and DeepMind, including lecture videos, slides, and practicals. Click "阅读原文" (Read the original) to go to Baidu Netdisk and download the videos and slides.

This is an advanced course on natural language processing. Automatically processing natural language inputs and producing language outputs is a key component of Artificial General Intelligence. The ambiguities and noise inherent in human communication render traditional symbolic AI techniques ineffective for representing and analysing language data. Recently, statistical techniques based on neural networks have achieved a number of remarkable successes in natural language processing, leading to a great deal of commercial and academic interest in the field.

This is an applied course focussing on recent advances in analysing and generating speech and text using recurrent neural networks. We introduce the mathematical definitions of the relevant machine learning models and derive their associated optimisation algorithms. The course covers a range of applications of neural networks in NLP including analysing latent dimensions in text, transcribing speech to text, translating between languages, and answering questions. These topics are organised into three high level themes forming a progression from understanding the use of neural networks for sequential language modelling, to understanding their use as conditional language models for transduction tasks, and finally to approaches employing these techniques in combination with other mechanisms for advanced applications. Throughout the course the practical implementation of such models on CPU and GPU hardware is also discussed.
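The first of these themes, using a recurrent network as a sequential language model, can be illustrated with a short sketch. The code below is not part of the course materials: it assumes PyTorch, a toy character-level corpus, and purely illustrative hyperparameters (embedding size, hidden size, learning rate, number of steps), and trains a GRU to predict the next character at each position by maximising likelihood.

    # A minimal sketch (not from the course materials) of a recurrent language model.
    # PyTorch, the toy corpus and all hyperparameters below are illustrative assumptions.
    import torch
    import torch.nn as nn

    class RNNLanguageModel(nn.Module):
        def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens):
            # tokens: (batch, seq_len) integer ids; return next-token logits at each step
            hidden, _ = self.rnn(self.embed(tokens))
            return self.out(hidden)  # (batch, seq_len, vocab_size)

    # Toy character-level corpus and vocabulary.
    text = "the quick brown fox jumps over the lazy dog "
    vocab = sorted(set(text))
    stoi = {ch: i for i, ch in enumerate(vocab)}
    ids = torch.tensor([stoi[ch] for ch in text]).unsqueeze(0)  # shape (1, seq_len)

    model = RNNLanguageModel(len(vocab))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()

    for step in range(200):
        logits = model(ids[:, :-1])          # condition on the prefix
        targets = ids[:, 1:]                 # next character at each position
        loss = loss_fn(logits.reshape(-1, len(vocab)), targets.reshape(-1))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

Conditional language models, covered later in the course, extend the same recipe by conditioning the recurrent state on an additional input such as a source sentence or an audio signal.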

This course is organised by Phil Blunsom and delivered in partnership with the DeepMind Natural Language Research Group.

Lecture slides:

  1. Lecture 1a - Introduction [Phil Blunsom]

  2. Lecture 1b - Deep Neural Networks Are Our Friends [Wang Ling]

  3. Lecture 2a - Word Level Semantics [Ed Grefenstette]

  4. Lecture 2b - Overview of the Practicals [Chris Dyer]

  5. Lecture 3 - Language Modelling and RNNs Part 1 [Phil Blunsom]

  6. Lecture 4 - Language Modelling and RNNs Part 2 [Phil Blunsom]

  7. Lecture 5 - Text Classification [Karl Moritz Hermann]

  8. Lecture 6 - Deep NLP on Nvidia GPUs [Jeremy Appleyard]

  9. Lecture 7 - Conditional Language Models [Chris Dyer]

  10. Lecture 8 - Generating Language with Attention [Chris Dyer]

  11. Lecture 9 - Speech Recognition (ASR) [Andrew Senior]

  12. Lecture 10 - Text to Speech (TTS) [Andrew Senior]

  13. Lecture 11 - Question Answering [Karl Moritz Hermann]

  14. Lecture 12 - Memory [Ed Grefenstette]

  15. Lecture 13 - Linguistic Knowledge in Neural Networks


Practicals:

  1. Practical 1: word2vec
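
As a rough illustration of what this practical covers, the sketch below implements the skip-gram formulation behind word2vec: each centre word is trained to predict the words in a small context window around it. This is not the official practical code; it assumes PyTorch, a toy corpus, a window of two words on each side, an illustrative embedding size, and a full-softmax objective rather than the negative sampling used in the original word2vec papers.

    # A minimal sketch of the skip-gram idea behind word2vec, not the official practical.
    # PyTorch, the toy corpus, the window size and the embedding size are illustrative assumptions.
    import torch
    import torch.nn as nn

    corpus = "we introduce the mathematical definitions of the relevant machine learning models".split()
    vocab = sorted(set(corpus))
    stoi = {w: i for i, w in enumerate(vocab)}

    # Build (centre, context) training pairs from a window of two words on each side.
    pairs = []
    for i in range(len(corpus)):
        for j in range(max(0, i - 2), min(len(corpus), i + 3)):
            if j != i:
                pairs.append((stoi[corpus[i]], stoi[corpus[j]]))
    centres = torch.tensor([c for c, _ in pairs])
    contexts = torch.tensor([c for _, c in pairs])

    embed_dim = 16
    in_embed = nn.Embedding(len(vocab), embed_dim)  # centre-word vectors (the ones usually kept)
    out_proj = nn.Linear(embed_dim, len(vocab))     # scores over possible context words
    optimizer = torch.optim.Adam(
        list(in_embed.parameters()) + list(out_proj.parameters()), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()  # full softmax; the original word2vec uses negative sampling

    for step in range(200):
        logits = out_proj(in_embed(centres))  # predict each context word from its centre word
        loss = loss_fn(logits, contexts)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # After training, the rows of in_embed.weight are the learned word vectors.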






