TY - JOUR
T1 - Convergence analysis for sparse Pi-sigma neural network model with entropy error function
AU - Fan, Qinwei
AU - Zheng, Fengjiao
AU - Huang, Xiaodi
AU - Xu, Dongpo
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.
PY - 2023/12
Y1 - 2023/12
N2 - As a high-order neural network, the Pi-sigma neural network has demonstrated its capacity for fast learning and strong nonlinear processing. In this paper, a new algorithm is proposed for Pi-sigma neural networks with entropy error functions based on L regularization. A key feature of the proposed algorithm is the use of an entropy error function instead of the more common square error function, which distinguishes it from most existing work. In addition, the proposed algorithm employs L regularization as a means of ensuring the efficiency of the network. Based on the gradient method, the monotonicity and the strong and weak convergence of the network are rigorously established through theoretical analysis and experimental verification. Experiments applying the proposed algorithm to both classification and regression problems demonstrate its improved performance.
KW - Convergence
KW - Entropy error function
KW - L Regularization
KW - Pi-sigma neural network
UR - http://www.scopus.com/inward/record.url?scp=85164455872&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85164455872&partnerID=8YFLogxK
UR - https://rdcu.be/dwGxK
U2 - 10.1007/s13042-023-01901-x
DO - 10.1007/s13042-023-01901-x
M3 - Article
SN - 1868-808X
VL - 14
SP - 4405
EP - 4416
JO - International Journal of Machine Learning and Cybernetics
JF - International Journal of Machine Learning and Cybernetics
IS - 12
ER -