Abstract: A problem with gradient descent algorithms is that they can converge to poorly performing local minima. Global optimization algorithms address this problem, but at the cost of greatly increased training times. This work examines c...
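The abstract describes combining gradient descent with a global optimization technique so that training can escape poor local minima without the full cost of a global search. The sketch below is a minimal, hypothetical illustration of that general idea, not the authors' method: gradient descent on a toy cost function with several local minima, perturbed by noise whose magnitude is annealed toward zero. The function names, the toy loss, and the geometric cooling schedule are all illustrative assumptions.

```python
# A minimal sketch (not the paper's algorithm) of annealed-noise gradient
# descent: early in training, noise lets the search jump out of poor local
# minima; as the temperature decays, updates reduce to plain gradient descent.
import numpy as np

def loss(w):
    # Toy 1-D cost function with several local minima (an assumption).
    return np.sin(3.0 * w) + 0.1 * w**2

def grad(w):
    # Analytic gradient of the toy cost function above.
    return 3.0 * np.cos(3.0 * w) + 0.2 * w

def anneal_gd(w0, lr=0.05, t0=1.0, decay=0.99, steps=500, seed=0):
    """Gradient descent with simulated-annealing-style noise injection."""
    rng = np.random.default_rng(seed)
    w, temp = w0, t0
    for _ in range(steps):
        # Noise magnitude shrinks with the temperature schedule.
        w -= lr * grad(w) + temp * rng.normal(scale=lr)
        temp *= decay  # geometric cooling schedule (an assumption)
    return w

if __name__ == "__main__":
    w_plain = anneal_gd(2.0, t0=0.0)  # t0 = 0 reduces to plain gradient descent
    w_anneal = anneal_gd(2.0)         # annealed run may escape local minima
    print(f"plain GD:    w = {w_plain:.3f}, loss = {loss(w_plain):.3f}")
    print(f"annealed GD: w = {w_anneal:.3f}, loss = {loss(w_anneal):.3f}")
```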
| Field | Value |
|---|---|
| Authors | Treadgold, N.K.; Gedeon, T.D. |
| Journal | *IEEE Transactions on Neural Networks* |
| Pages / Total pages | pp. 662-668 / 7 |
| Language / CLC class | English / TP |
| Keywords | Simulated annealing; Backpropagation algorithms; Optimization methods; Convergence; Neural networks; Feedforward neural networks; Gradient methods; Computer networks; Cost function; Feedforward systems |
| DOI | 10.1109/72.701179 |
| Holdings no. | IELEP0180 |