Column: 机器学习研究会 (Machine Learning Research Society)
The Machine Learning Research Society is a student organization under the Big Data and Machine Learning Innovation Center at Peking University, aiming to build a platform where machine learning practitioners can exchange ideas. Besides sharing timely news from the field, the society also hosts talks by industry and academic leaders, salon-style sharing sessions with prominent researchers, real-data innovation competitions, and other activities.

[Recommended] (Paper + code) A Wasserstein GAN (WGAN) with faster and more stable convergence

机器学习研究会 · WeChat Official Account · AI · 2017-04-06 19:11

Main text



Abstract

Reposted from: 爱可可-爱生活

Abstract of the paper "Improved Training of Wasserstein GANs":

Generative Adversarial Networks (GANs) are powerful generative models, but suffer from training instability. The recently proposed Wasserstein GAN (WGAN) makes significant progress toward stable training of GANs, but can still generate low-quality samples or fail to converge in some settings. We find that these training failures are often due to the use of weight clipping in WGAN to enforce a Lipschitz constraint on the critic, which can lead to pathological behavior. We propose an alternative method for enforcing the Lipschitz constraint: instead of clipping weights, penalize the norm of the gradient of the critic with respect to its input. Our proposed method converges faster and generates higher-quality samples than WGAN with weight clipping. Finally, our method enables very stable GAN training: for the first time, we can train a wide variety of GAN architectures with almost no hyperparameter tuning, including 101-layer ResNets and language models over discrete data.

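For intuition, here is a minimal sketch of the gradient penalty the abstract describes: penalize the deviation of the critic's input-gradient norm from 1, evaluated at random interpolations between real and generated samples. This is only an illustration, not the authors' reference implementation (which is in TensorFlow, linked below); PyTorch is used here, and the names `critic`, `real`, `fake`, and `gradient_penalty` are hypothetical. The coefficient λ = 10 is the default reported in the paper.

```python
import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    """WGAN-GP penalty: lam * E[(||grad_x critic(x_hat)||_2 - 1)^2]."""
    # eps ~ U[0, 1], broadcast over all non-batch dimensions
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    # Interpolate between real and generated samples
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(x_hat)
    # Gradient of the critic's output w.r.t. its *input* (not its weights)
    grads, = torch.autograd.grad(
        outputs=scores,
        inputs=x_hat,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,  # keep the graph so the penalty is itself differentiable
    )
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    # Two-sided penalty: push the per-sample gradient norm toward 1
    return lam * ((grad_norm - 1.0) ** 2).mean()
```

In a critic update this term is simply added to the usual WGAN loss, e.g. `critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake)`, replacing the weight clipping step entirely.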

Paper link:

https://arxiv.org/abs/1704.00028


Code link:

https://github.com/igul222/improved_wgan_training


Original post link:

http://weibo.com/1402400261/EDdojryZI?from=page_1005051402400261_profile&wvr=6&mod=weibotime&type=comment
