Column: 机器学习研究会 (Machine Learning Research Society)
The Machine Learning Research Society is a student organization under the Big Data and Machine Learning Innovation Center at Peking University, aiming to build a platform where machine learning practitioners can exchange ideas. Besides sharing timely news from the field, the society also hosts talks by industry and academic leaders, salon-style sharing sessions with prominent researchers, real-data innovation competitions, and other activities.

[Paper] Deep Stochastic Configuration Networks: Universal Approximation and Learning Representation

机器学习研究会 · WeChat Official Account · AI · 2017-02-21 18:48

Main text


Abstract
 

Authors: Dianhui Wang & Ming Li

Abstract of the paper "Deep Stochastic Configuration Networks: Universal Approximation and Learning Representation":

This paper focuses on the development of randomized approaches for building deep neural networks. A supervisory mechanism is proposed to constrain the random assignment of the hidden parameters (i.e., all biases and weights within the hidden layers). A full-rank oriented criterion is suggested and utilized as a termination condition to determine the number of nodes for each hidden layer, and a pre-defined error tolerance is used as a global indicator to decide the depth of the learner model. The read-out weights attached to all direct links from each hidden layer to the output layer are incrementally evaluated by the least squares method. Such a class of randomized learner models with deep architecture is termed deep stochastic configuration networks (DeepSCNs), of which the universal approximation property is verified with rigorous proof. Given abundant samples from a continuous distribution, DeepSCNs can speedily produce a learning representation, that is, a collection of random basis functions with the cascaded inputs together with the read-out weights. Simulation results with comparisons on function approximation align with the theoretical findings.
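Below is a minimal NumPy sketch of the construction scheme the abstract describes, not the authors' reference implementation. In this simplified version, hidden nodes are drawn at random and a candidate is kept only if it reduces the training residual (a crude stand-in for the paper's supervisory mechanism), node addition stops when the hidden output matrix stops gaining rank (standing in for the full-rank oriented criterion), read-out weights are computed by least squares, and layers are stacked until the training error falls below a pre-defined tolerance. All function and parameter names are illustrative assumptions.

```python
import numpy as np

def build_deep_scn(X, y, max_layers=3, max_nodes=50, tol=1e-3, n_candidates=20, seed=0):
    """Incrementally build a deep network from random hidden nodes with
    least-squares read-out weights (a simplified DeepSCN-style sketch)."""
    rng = np.random.default_rng(seed)
    layers, readouts = [], []
    H_in = X                                    # cascaded input to the current layer
    for _ in range(max_layers):
        W = np.empty((H_in.shape[1], 0))
        b = np.empty(0)
        H = np.empty((X.shape[0], 0))
        beta = None
        for _ in range(max_nodes):
            # Draw several random candidate nodes and keep the one that most
            # reduces the training residual (a stand-in for the paper's
            # supervisory inequality constraint on the hidden parameters).
            best = None
            for _ in range(n_candidates):
                w_c = rng.uniform(-1.0, 1.0, size=H_in.shape[1])
                b_c = rng.uniform(-1.0, 1.0)
                H_c = np.column_stack([H, np.tanh(H_in @ w_c + b_c)])
                beta_c, *_ = np.linalg.lstsq(H_c, y, rcond=None)
                err = np.linalg.norm(y - H_c @ beta_c)
                if best is None or err < best[0]:
                    best = (err, w_c, b_c, H_c, beta_c)
            _, w_c, b_c, H_c, beta_c = best
            # Full-rank oriented stopping: stop adding nodes once a new column
            # no longer increases the rank of the hidden output matrix.
            if H.shape[1] > 0 and np.linalg.matrix_rank(H_c) <= np.linalg.matrix_rank(H):
                break
            W = np.column_stack([W, w_c])
            b = np.append(b, b_c)
            H, beta = H_c, beta_c
        layers.append((W, b))
        readouts.append(beta)                   # read-out weights from least squares
        if np.linalg.norm(y - H @ beta) < tol:  # pre-defined error tolerance decides the depth
            break
        H_in = H                                # cascade: next layer reads this layer's output
    return layers, readouts

# Hypothetical usage on a toy regression problem:
# X = np.random.rand(200, 2); y = np.sin(X[:, 0]) + np.cos(X[:, 1])
# layers, readouts = build_deep_scn(X, y)
```

The candidate-pool selection and the rank check above are deliberate simplifications; the exact supervisory inequality and termination conditions used by DeepSCNs are given in the paper linked below.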


Original article link:

https://www.researchgate.net/publication/313856695_Deep_Stochastic_Configuration_Networks_Universal_Approximation_and_Learning_Representation?isFromSharing=1
