The goal of this course is to review currently available theories for deep learning and encourage better theoretical understanding of deep learning algorithms.
Lecture slides for STATS385, Fall 2017
Lecture01: Deep Learning Challenge. Is There Theory? (Donoho/Monajemi/Papyan)
Lecture02: Overview of Deep Learning From a Practical Point of View (Donoho/Monajemi/Papyan)
Lecture03: Harmonic Analysis of Deep Convolutional Neural Networks (Helmut Bolcskei)
Lecture04: Convnets from First Principles: Generative Models, Dynamic Programming & EM (Ankit Patel)
Lecture05: When Can Deep Networks Avoid the Curse of Dimensionality and Other Theoretical Puzzles (Tomaso Poggio)
Lecture06: Views of Deep Networks from Reproducing Kernel Hilbert Spaces (Zaid Harchaoui)
Lecture 1 – Deep Learning Challenge. Is There Theory?
Readings
- Deep Deep Trouble
- Why 2016 is The Global Tipping Point...
- Are AI and ML Killing Analytics...
- The Dark Secret at The Heart of AI
- AI Robots Learning Racism...
- FaceApp Forced to Pull ‘Racist’ Filters...
- Losing a Whole Generation of Young Men to Video Games
Lecture 2 – Overview of Deep Learning From a Practical Point of View
Readings
- Emergence of simple-cell receptive field properties by learning a sparse code for natural images
- ImageNet Classification with Deep Convolutional Neural Networks (AlexNet)
- Very Deep Convolutional Networks for Large-Scale Image Recognition (VGG)
- Going Deeper with Convolutions (GoogLeNet)
- Deep Residual Learning for Image Recognition (ResNet); a minimal residual-block sketch follows this lecture's materials
- Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
- Visualizing and Understanding Convolutional Networks
Blogs
- An Intuitive Guide to Deep Network Architectures
- Neural Network Architectures
Videos
- Deep Visualization Toolbox
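
The ResNet and Batch Normalization readings above share one architectural idea: a block computes a residual F(x) and outputs relu(F(x) + x), with normalization after each convolution to stabilize training. Below is a minimal sketch of such a block in PyTorch; the module structure and shapes are illustrative assumptions, not code from the papers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Basic residual block: out = relu(F(x) + x), with F = conv-BN-relu-conv-BN."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # identity shortcut: gradients flow through the skip

# e.g. a batch of 8 feature maps with 64 channels of size 32x32
y = ResidualBlock(64)(torch.randn(8, 64, 32, 32))
```

The identity shortcut is the design choice that matters: because the skip path is a plain addition, gradients reach early layers without being attenuated by the stacked convolutions.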
Lecture 3 – Harmonic Analysis of Deep Convolutional Neural Networks
Readings
- A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction (a toy modulus-cascade sketch follows this list)
- Energy Propagation in Deep Convolutional Neural Networks
- Discrete Deep Feature Extraction: A Theory and New Architectures
- Topology Reduction in Deep Convolutional Feature Extraction Networks
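
These papers analyze scattering-type feature extractors: cascades of convolutions followed by a modulus non-linearity and pooling, studied for energy propagation and deformation stability. A toy one-dimensional version in Python/NumPy (the Gabor-like filter bank here is an illustrative assumption, not the frame constructions from the papers):

```python
import numpy as np

def modulus_layer(signal, filters):
    """One scattering-type layer: convolve with each filter, take the modulus."""
    return [np.abs(np.convolve(signal, h, mode="same")) for h in filters]

# Toy filter bank: two Gabor-like atoms at different center frequencies.
t = np.linspace(-3, 3, 31)
filters = [np.exp(-t**2) * np.cos(w * t) for w in (2.0, 6.0)]

x = np.random.randn(256)                   # input signal
layer1 = modulus_layer(x, filters)         # first-layer feature maps
layer2 = [v for u in layer1 for v in modulus_layer(u, filters)]
features = [u.mean() for u in layer1 + layer2]   # low-pass (average) pooling
```

The questions the readings address concern exactly this construction: how much input energy survives each modulus layer, and how many layers are needed before the extracted features carry most of it.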
Lecture 4 – Convnets from First Principles: Generative Models, Dynamic Programming & EM
Readings
- A Probabilistic Framework for Deep Learning
- Semi-Supervised Learning with the Deep Rendering Mixture Model
- A Probabilistic Theory of Deep Learning
Lecture 5 – When Can Deep Networks Avoid the Curse of Dimensionality and Other Theoretical Puzzles
Readings
- Why and When Can Deep - but Not Shallow - Networks Avoid the Curse of Dimensionality: A Review (the headline bound is restated after this list)
- Learning Functions: When Is Deep Better Than Shallow
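
The core claim of both readings can be stated as one pair of bounds. For a generic n-variate function of smoothness m on the unit cube, a shallow network needs on the order of ε^(-n/m) units to reach accuracy ε, while a function with a binary-tree compositional structure (each constituent a smooth function of two variables) can be matched by a deep network with only polynomially many units. In LaTeX, with constants suppressed (this paraphrases the bounds in the review; consult the paper for the exact hypotheses):

```latex
N_{\mathrm{shallow}}(\varepsilon) = O\!\left(\varepsilon^{-n/m}\right),
\qquad
N_{\mathrm{deep}}(\varepsilon) = O\!\left((n-1)\,\varepsilon^{-2/m}\right)
% e.g. the compositional case
% f(x_1,\dots,x_4) = h_2\bigl(h_{11}(x_1,x_2),\, h_{12}(x_3,x_4)\bigr)
```

The exponent drops from n to 2 because each node of the compositional hierarchy only ever has to approximate a function of two variables.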
Lecture 6 – Views of Deep Networks from Reproducing Kernel Hilbert Spaces
Readings
- Convolutional Patch Representations for Image Retrieval: An Unsupervised Approach
- Convolutional Kernel Networks
- Kernel Descriptors for Visual Recognition
- End-to-End Kernel Learning with Supervised Convolutional Kernel Networks
- Learning with Kernels (a kernel ridge regression sketch follows this list)
- Kernel Based Methods for Hypothesis Testing
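
The unifying idea in these readings is to view the representation computed by a convolutional network as (an approximation of) the feature map of a reproducing kernel, so that learning reduces to regression in an RKHS. The simplest concrete instance of that reduction is kernel ridge regression with a Gaussian kernel, sketched below in Python/NumPy; this is a textbook baseline, not the convolutional kernel network construction itself.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """k(a, b) = exp(-||a - b||^2 / (2 sigma^2)), evaluated pairwise."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma**2))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))                 # toy 1-D regression data
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)

lam = 1e-2                                           # ridge regularization
alpha = np.linalg.solve(gaussian_kernel(X, X) + lam * np.eye(len(X)), y)

X_new = np.linspace(-3, 3, 5)[:, None]
y_hat = gaussian_kernel(X_new, X) @ alpha            # predictor in the RKHS
```

By the representer theorem the learned function is a kernel expansion over the training points; the CKN readings replace this fixed kernel with one whose feature map is parameterized and trained like a convnet.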
Lecture 7
Readings
- Geometry of Neural Network Loss Surfaces via Random Matrix Theory
- Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice
- Nonlinear random matrix theory for deep learning (a Marchenko-Pastur baseline sketch follows this list)
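
The baseline object in all three papers is the spectrum of a large random Gram matrix, whose limiting distribution for i.i.d. entries is the Marchenko-Pastur law. A quick empirical check in Python/NumPy (the dimensions are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 4000, 1000                        # aspect ratio gamma = p / n
W = rng.standard_normal((n, p))
eigs = np.linalg.eigvalsh(W.T @ W / n)   # spectrum of the sample Gram matrix

gamma = p / n                            # Marchenko-Pastur support edges
lo, hi = (1 - np.sqrt(gamma)) ** 2, (1 + np.sqrt(gamma)) ** 2
print(f"empirical eigenvalue range:  [{eigs.min():.3f}, {eigs.max():.3f}]")
print(f"Marchenko-Pastur prediction: [{lo:.3f}, {hi:.3f}]")
```

The papers ask how this picture changes when W is a trained weight matrix or a point-wise non-linearity is applied, and what the resulting spectra imply for loss surfaces and signal propagation.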
Lecture 8
Readings
- Deep Learning without Poor Local Minima
- Topology and Geometry of Half-Rectified Network Optimization
- Convexified Convolutional Neural Networks
- Implicit Regularization in Matrix Factorization (a gradient-descent sketch follows this list)
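
The last reading makes a concrete, checkable claim: run gradient descent on an overparameterized factorization X = UUᵀ from near-zero initialization, fit only the observed entries of a low-rank matrix, and the iterates drift toward a low-rank (conjecturally minimum nuclear norm) solution even though nothing in the loss enforces it. A small Python/NumPy sketch of that regime; the problem sizes and step size are arbitrary choices, and the sketch only exhibits the low-rank bias, not the paper's conjecture:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 2
A = rng.standard_normal((n, r))
M = A @ A.T                                   # rank-2 PSD ground truth
obs = np.triu(rng.random((n, n)) < 0.5)
mask = obs | obs.T                            # symmetric observation pattern

U = 1e-3 * rng.standard_normal((n, n))        # full-rank factor, tiny init
lr = 0.005
for _ in range(30000):
    R = (U @ U.T - M) * mask                  # residual on observed entries
    U -= lr * 2.0 * (R @ U)                   # grad of (1/2)||R||_F^2 (R symmetric)

X = U @ U.T
print("numerical rank of recovery:", np.linalg.matrix_rank(X, tol=1e-2))
print("error on unobserved entries:", np.abs(X - M)[~mask].mean())
```

Despite U being a full n × n matrix, the recovered X ends up numerically low-rank and tracks M on the entries gradient descent never saw, which is the implicit regularization phenomenon the paper formalizes.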
Lecture 9
Readings