To address the slow convergence of the back-propagation (BP) learning algorithm for feed-forward neural networks, a new learning algorithm, the linearized fast learning algorithm, is proposed. In the early stage of training, the algorithm uses the standard BP learning algorithm. As the network approaches the optimum, the weight adjustments become very small, so the nonlinear activation function of the neurons in each layer is expanded in a Taylor series and approximated by its first-order expansion. The nonlinear activation function is thereby replaced by a linear one, which reduces the computation required in the learning process and accelerates network learning. Finally, the learning of a sine function with the linearized algorithm and with the standard BP algorithm is compared.
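The linearization step described above can be sketched as follows. The paper does not give code, so the choice of a sigmoid activation and the expansion point `x0` are illustrative assumptions; the point is only that when weight updates are small, the first-order Taylor expansion of the activation tracks the exact nonlinear function closely:

```python
import math

def sigmoid(x):
    """Nonlinear activation function (illustrative choice)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    """Derivative of the sigmoid, sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

def linearized_sigmoid(x, x0):
    """First-order Taylor expansion of the sigmoid about x0:
    sigma(x) ~= sigma(x0) + sigma'(x0) * (x - x0)."""
    return sigmoid(x0) + sigmoid_prime(x0) * (x - x0)

if __name__ == "__main__":
    # Near the optimum the net input changes little per step,
    # so x stays close to the expansion point x0.
    x0 = 0.8  # hypothetical net input at the current weights
    for dx in (0.01, 0.05, 0.1):
        exact = sigmoid(x0 + dx)
        approx = linearized_sigmoid(x0 + dx, x0)
        print(f"dx={dx:0.2f}: exact={exact:.6f} "
              f"linear={approx:.6f} err={abs(exact - approx):.2e}")
```

The approximation error is second order in the step size, which is why the linearization is only switched on late in training, once the weight adjustments have become small.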