Column: Machine Learning Research Society (机器学习研究会)
The Machine Learning Research Society is a student organization under Peking University's Innovation Center for Big Data and Machine Learning, dedicated to building a platform where machine learning practitioners can exchange ideas. Beyond sharing timely news from the field, the society hosts talks by industry leaders and prominent academics, salon-style sharing sessions with leading researchers, real-data innovation competitions, and other events.

[Recommended] The Power of Sparsity in Convolutional Neural Networks

Machine Learning Research Society · WeChat Official Account · AI · 2017-02-23 18:57





Abstract
 

Reposted from: 爱可可-爱生活

Abstract of the paper "The Power of Sparsity in Convolutional Neural Networks":
Deep convolutional networks are well-known for their high computational and memory demands. Given limited resources, how does one design a network that balances its size, training time, and prediction accuracy? A surprisingly effective approach to trade accuracy for size and speed is to simply reduce the number of channels in each convolutional layer by a fixed fraction and retrain the network. In many cases this leads to significantly smaller networks with only minimal changes to accuracy. In this paper, we take a step further by empirically examining a strategy for deactivating connections between filters in convolutional layers in a way that allows us to harvest savings both in run-time and memory for many network architectures. More specifically, we generalize 2D convolution to use a channel-wise sparse connection structure and show that this leads to significantly better results than the baseline approach for large networks including VGG and Inception V3.

Link:
https://arxiv.org/abs/1702.06257
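
To make the channel-wise sparse connection idea concrete, below is a minimal sketch in PyTorch. It is not the authors' code: the class name ChannelSparseConv2d, the random mask, and the sparsity parameter are illustrative assumptions. The sketch deactivates connections between input and output channels by applying a fixed binary mask to the convolution weights.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelSparseConv2d(nn.Conv2d):
    # Conv2d whose (output channel, input channel) connections are
    # deactivated according to a fixed binary mask, broadcast over
    # the spatial kernel dimensions.
    def __init__(self, in_ch, out_ch, kernel_size, sparsity=0.5, **kw):
        super().__init__(in_ch, out_ch, kernel_size, **kw)
        # Illustrative assumption: keep each channel-to-channel
        # connection independently with probability (1 - sparsity).
        mask = (torch.rand(out_ch, in_ch) > sparsity).float()
        # Shape (out_ch, in_ch, 1, 1) so it broadcasts over the kernel.
        self.register_buffer("mask", mask[:, :, None, None])

    def forward(self, x):
        # Masked weights: deactivated connections contribute nothing.
        return F.conv2d(x, self.weight * self.mask, self.bias,
                        self.stride, self.padding, self.dilation,
                        self.groups)

# Usage: drop roughly half of the channel-to-channel connections.
layer = ChannelSparseConv2d(64, 128, kernel_size=3, sparsity=0.5, padding=1)
out = layer(torch.randn(1, 64, 32, 32))
print(out.shape)  # torch.Size([1, 128, 32, 32])

Because the mask multiplies the weights in the forward pass, gradients on masked entries are zero and deactivated connections stay inactive during retraining; the paper's actual connection pattern and training procedure may differ from this sketch.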

Original post:
http://weibo.com/1402400261/EwNMFdxnw?from=page_1005051402400261_profile&wvr=6&mod=weibotime&type=comment#_rnd1487839624567