Optimizing neural network parameters is an important problem, and different algorithms yield different performance. This paper studies parameter optimization for the feedforward neural network (FFNN) based on the variable projection (VP) algorithm: SVD decomposition is applied to solve the separable-variable nonlinear least-squares problem, which not only reduces the dimensionality of the parameter space but also simplifies the topology of the search space, thereby improving the performance of FFNN parameter optimization. The well-known Mackey-Glass chaotic time series is used to test the algorithm. The experimental results show that the SVD-based separable-variable nonlinear least-squares algorithm achieves a faster convergence rate and higher prediction accuracy than the unseparated nonlinear least-squares algorithm.
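The key idea of the variable-projection approach can be illustrated with a minimal sketch. For a single-hidden-layer FFNN, the output is linear in the output weights, y ≈ Φ(x; w)·c, so the linear weights c can be eliminated via an SVD-based pseudoinverse and only the nonlinear hidden-layer parameters w are optimized. The code below is an illustrative sketch under these assumptions (the toy data, network size, and function names are not from the paper), not the paper's actual implementation:

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch of variable projection (VP) for separable nonlinear least squares.
# Model: y ≈ Phi(x; w) @ c, where w are the nonlinear hidden-layer
# parameters of an FFNN and c are the linear output weights.

def hidden(x, w, n_hidden):
    """Hidden-layer activation matrix Phi(x; w); w packs weights and biases."""
    W = w[:n_hidden].reshape(1, n_hidden)   # input-to-hidden weights
    b = w[n_hidden:].reshape(1, n_hidden)   # hidden biases
    return np.tanh(x[:, None] * W + b)

def vp_residual(w, x, y, n_hidden):
    """Project out the linear weights c (SVD-based pseudoinverse),
    then return the residual of the reduced VP problem in w only."""
    Phi = hidden(x, w, n_hidden)
    c = np.linalg.pinv(Phi) @ y             # pinv is computed via the SVD
    return y - Phi @ c

# Toy regression problem standing in for the time-series data.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 200)
y = np.sin(2.0 * x) + 0.01 * rng.standard_normal(x.size)

n_hidden = 5
w0 = rng.standard_normal(2 * n_hidden)      # only nonlinear params are searched
sol = least_squares(vp_residual, w0, args=(x, y, n_hidden))

# Recover the linear weights for the optimized nonlinear parameters.
Phi = hidden(x, sol.x, n_hidden)
c = np.linalg.pinv(Phi) @ y
rmse = np.sqrt(np.mean((y - Phi @ c) ** 2))
print("final RMSE:", rmse)
```

Note the dimensionality reduction the abstract refers to: the optimizer searches only over the 2·n_hidden nonlinear parameters, while the linear weights are obtained in closed form at every step.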