Training Convolutional ReLU Neural Networks in Polynomial Time: Exact Convex Optimization Formulations
arXiv - CS - Computational Complexity, Pub Date: 2020-06-26, DOI: arxiv-2006.14798
Tolga Ergen; Mert Pilanci

We study the training of Convolutional Neural Networks (CNNs) with ReLU activations and introduce exact convex optimization formulations with polynomial complexity with respect to the number of data samples, the number of neurons, and the data dimension. In particular, we develop a convex analytic framework utilizing semi-infinite duality to obtain equivalent convex optimization problems for several CNN architectures. We first prove that two-layer CNNs can be globally optimized via an $\ell_2$ norm regularized convex program. We then show that certain three-layer CNN training problems are equivalent to an $\ell_1$ regularized convex program. We also extend these results to multi-layer CNN architectures. Furthermore, we present extensions of our approach to different pooling methods.
Updated: 2020-06-29
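As a rough illustration of the kind of convex program the abstract refers to, the sketch below solves a finite convex reformulation of a two-layer ReLU network with group $\ell_2$ regularization using CVXPY. It is only a sketch of the general idea, not the paper's exact construction: it uses a fully connected network rather than a CNN, samples ReLU activation patterns randomly instead of enumerating all hyperplane arrangements, and the data, variable names, and parameter values are illustrative assumptions.

```python
import numpy as np
import cvxpy as cp

# Illustrative sketch: convex reformulation of a two-layer ReLU network
# (fully connected case; the paper treats convolutional architectures).
np.random.seed(0)
n, d = 20, 3                      # number of samples, input dimension
X = np.random.randn(n, d)         # synthetic data (assumption for illustration)
y = np.random.randn(n)            # synthetic labels
beta = 0.1                        # regularization strength (assumption)

# Sample ReLU activation patterns D_i = diag(1[X u >= 0]).
# In the exact formulation one enumerates all distinct hyperplane
# arrangements, which is polynomial in n for fixed dimension.
patterns = {tuple((X @ np.random.randn(d) >= 0).astype(int)) for _ in range(50)}
D = [np.diag(p) for p in patterns]

V = [cp.Variable(d) for _ in D]   # weights for the positive branch of each pattern
W = [cp.Variable(d) for _ in D]   # weights for the negative branch of each pattern

# Model output is a sum over patterns of D_i X (v_i - w_i).
residual = sum(Di @ X @ (vi - wi) for Di, vi, wi in zip(D, V, W)) - y
objective = cp.sum_squares(residual) + beta * sum(
    cp.norm(vi, 2) + cp.norm(wi, 2) for vi, wi in zip(V, W)
)

# Constraints enforce that each weight vector is consistent with its
# activation pattern, i.e. (2 D_i - I) X v_i >= 0 elementwise.
constraints = []
for Di, vi, wi in zip(D, V, W):
    constraints += [(2 * Di - np.eye(n)) @ X @ vi >= 0,
                    (2 * Di - np.eye(n)) @ X @ wi >= 0]

cp.Problem(cp.Minimize(objective), constraints).solve()
print("optimal convex objective:", objective.value)
```

Because the problem is convex, any solver that handles second-order cone programs returns a global optimum; the nonzero blocks $(v_i, w_i)$ can then be mapped back to ReLU neurons of an equivalent two-layer network.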

 
