The latest deep learning course for natural language processing from Oxford and DeepMind, including lecture videos, slides, and practicals. Click "Read the original" to download the videos and slides from the Baidu Netdisk.
This is an advanced course on natural language processing. Automatically processing natural language inputs and producing language outputs is a key component of Artificial General Intelligence. The ambiguities and noise inherent in human communication render traditional symbolic AI techniques ineffective for representing and analysing language data. Recently, statistical techniques based on neural networks have achieved a number of remarkable successes in natural language processing, leading to a great deal of commercial and academic interest in the field.
This is an applied course focussing on recent advances in analysing and generating speech and text using recurrent neural networks. We introduce the mathematical definitions of the relevant machine learning models and derive their associated optimisation algorithms. The course covers a range of applications of neural networks in NLP including analysing latent dimensions in text, transcribing speech to text, translating between languages, and answering questions. These topics are organised into three high level themes forming a progression from understanding the use of neural networks for sequential language modelling, to understanding their use as conditional language models for transduction tasks, and finally to approaches employing these techniques in combination with other mechanisms for advanced applications. Throughout the course the practical implementation of such models on CPU and GPU hardware is also discussed.
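The first theme above, sequential language modelling with recurrent neural networks, can be illustrated with a toy sketch. This is not the course's implementation; all names, sizes, and the vanilla-RNN choice are assumptions for illustration. Each step consumes one token, updates a hidden state, and emits a softmax distribution over the next token:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, hidden_size = 10, 8  # toy sizes, chosen arbitrarily

# Parameters: input-to-hidden, hidden-to-hidden, and hidden-to-output matrices.
W_xh = rng.normal(0, 0.1, (hidden_size, vocab_size))
W_hh = rng.normal(0, 0.1, (hidden_size, hidden_size))
W_hy = rng.normal(0, 0.1, (vocab_size, hidden_size))

def rnn_step(h_prev, token_id):
    """One RNN step: read a token, update the hidden state, and return
    a probability distribution over the next token."""
    x = np.zeros(vocab_size)
    x[token_id] = 1.0                      # one-hot encoding of the input token
    h = np.tanh(W_xh @ x + W_hh @ h_prev)  # recurrent hidden-state update
    logits = W_hy @ h
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    return h, probs

# Run the model over a short toy token sequence.
h = np.zeros(hidden_size)
for tok in [1, 4, 2]:
    h, p = rnn_step(h, tok)
```

In a trained model the parameters would be fitted by backpropagation through time to maximise the likelihood of observed text; here they are random, so `p` is close to uniform.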
This course is organised by Phil Blunsom and delivered in partnership with the DeepMind Natural Language Research Group.
Lecture slides:
- Lecture 1a - Introduction [Phil Blunsom]
- Lecture 1b - Deep Neural Networks Are Our Friends [Wang Ling]
- Lecture 2a - Word Level Semantics [Ed Grefenstette]
- Lecture 2b - Overview of the Practicals [Chris Dyer]
- Lecture 3 - Language Modelling and RNNs Part 1 [Phil Blunsom]
- Lecture 4 - Language Modelling and RNNs Part 2 [Phil Blunsom]
- Lecture 5 - Text Classification [Karl Moritz Hermann]
- Lecture 6 - Deep NLP on Nvidia GPUs [Jeremy Appleyard]
- Lecture 7 - Conditional Language Models [Chris Dyer]
- Lecture 8 - Generating Language with Attention [Chris Dyer]
- Lecture 9 - Speech Recognition (ASR) [Andrew Senior]
- Lecture 10 - Text to Speech (TTS) [Andrew Senior]
- Lecture 11 - Question Answering [Karl Moritz Hermann]
- Lecture 12 - Memory [Ed Grefenstette]
- Lecture 13 - Linguistic Knowledge in Neural Networks
Practicals:
- Practical 1: word2vec
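As a rough sketch of what the word2vec practical involves, the following toy skip-gram trainer uses a full softmax over a tiny corpus. This is purely illustrative and not the practical's code: the original word2vec uses negative sampling or hierarchical softmax on much larger corpora, and every name and hyperparameter here is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16  # vocabulary size and embedding dimension (toy values)

# Separate "input" (center-word) and "output" (context-word) embeddings.
W_in = rng.normal(0, 0.1, (V, D))
W_out = rng.normal(0, 0.1, (V, D))

def train_pair(center, context, lr=0.05):
    """One SGD step on a (center, context) pair with a full softmax,
    returning the cross-entropy loss for that pair."""
    global W_out
    v = W_in[center]
    scores = W_out @ v
    probs = np.exp(scores - scores.max())  # softmax over the vocabulary
    probs /= probs.sum()
    grad = probs.copy()
    grad[context] -= 1.0                   # gradient of loss w.r.t. scores
    W_in[center] -= lr * (W_out.T @ grad)
    W_out -= lr * np.outer(grad, v)
    return -np.log(probs[context])

# Slide a window of radius 2 over the corpus for a few passes.
for _ in range(50):
    for i, w in enumerate(corpus):
        for j in range(max(0, i - 2), min(len(corpus), i + 3)):
            if j != i:
                loss = train_pair(idx[w], idx[corpus[j]])
```

After training, rows of `W_in` serve as word vectors; words appearing in similar contexts end up with similar rows.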