NCHU Institution Repository — https://ir.lib.nchu.edu.tw
DSpace is a digital archive system used to acquire, store, index, and distribute digital research materials.
Sun, 28 May 2023 18:43:56 GMT

Title: A Limitation of Gradient Descent Learning
Authors: John Sum; Chi-Sing Leung; Kevin Ho
Abstract: For decades, gradient descent has been applied to develop learning algorithms to train neural networks (NNs). In this brief, a limitation of applying such algorithms to train an NN with persistent weight noise is revealed. Let V(w) be the performance measure of an ideal NN. V(w) is applied to develop the gradient descent learning (GDL). With weight noise, the desired performance measure (denoted as J(w)) is E[V(~w)|w], where ~w is the noisy weight vector. Applying GDL to train an NN with weight noise, the actual learning objective is clearly not V(w) but another scalar function L(w). For decades, there has been a misconception that L(w) = J(w), and hence that the actual model attained by the GDL is the desired model. However, we show that it might not be: 1) with persistent additive weight noise, the actual model attained is the desired model, as L(w) = J(w); and 2) with persistent multiplicative weight noise, the actual model attained is unlikely to be the desired model, as L(w) ≠ J(w). Accordingly, the properties of the models attained, as compared with the desired models, are analyzed, and the learning curves are sketched. Simulation results on 1) a simple regression problem and 2) the MNIST handwritten digit recognition task are presented to support our claims.
http://hdl.handle.net/11455/100478
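The multiplicative-noise case in the abstract can be illustrated with a minimal sketch (an assumed toy setup, not the paper's exact experiment): train a scalar linear model y = w·x by gradient descent while injecting persistent multiplicative weight noise, then compare the weight the algorithm attains with the minimizer of the desired objective J(w) = E[V(w(1+b))|w]. For quadratic loss and noise b ~ N(0, σ²), J(w) = V(w) + σ²·w²·E[x²], so its minimizer is shrunk relative to the noiseless solution, while the noisy GDL iterates converge (in expectation) to the minimizer of V(w) instead.

```python
# Toy illustration (assumed setup): GDL with persistent multiplicative
# weight noise attains the minimizer of V(w), not of the desired J(w).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 2.0 * x                      # true weight is 2.0
sigma = 0.5                      # multiplicative weight-noise level

w, eta = 0.0, 0.02
trace = []
for t in range(10000):
    b = rng.normal(scale=sigma)
    w_noisy = w * (1.0 + b)      # persistent multiplicative weight noise
    # gradient of V evaluated at the noisy weight
    grad = -2.0 * np.mean(x * (y - w_noisy * x))
    w -= eta * grad
    if t >= 5000:                # average the tail to smooth out noise
        trace.append(w)
w_attained = float(np.mean(trace))

# Desired model: argmin J(w) = V(w) + sigma^2 * w^2 * E[x^2],
# i.e. w = E[xy] / ((1 + sigma^2) * E[x^2]) = 2 / 1.25 = 1.6
w_desired = float(np.mean(x * y) / ((1.0 + sigma**2) * np.mean(x**2)))

print(w_attained)                # close to 2.0 (the V-minimizer)
print(w_desired)                 # 1.6 (the J-minimizer) -- not attained
```

With additive noise (~w = w + b), J(w) differs from V(w) only by a constant, so the same procedure would attain the desired model, consistent with case 1) in the abstract.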