Abstract

As a high-order neural network, the Pi-sigma neural network has demonstrated its capacity for fast learning and strong nonlinear processing. In this paper, a new algorithm is proposed for Pi-sigma neural networks with an entropy error function based on L regularization. A key feature of the proposed algorithm is the use of an entropy error function in place of the square error function common in most of the existing literature. In addition, the algorithm employs L regularization to sparsify the network and thereby ensure its efficiency. Based on the gradient method, the monotonicity of the error and the strong and weak convergence of the algorithm are rigorously proved, and the theoretical results are confirmed by experiments. Experiments applying the proposed algorithm to both classification and regression problems demonstrate its improved performance.
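To make the setting concrete, the following is a minimal sketch of the ingredients named in the abstract: a Pi-sigma unit (linear "sigma" sums whose outputs are multiplied by a "pi" unit), gradient descent on the entropy (cross-entropy) error, and a regularization penalty. The paper does not specify the norm here, so the sketch assumes an L1 penalty for sparsity; the class name, learning rate, and penalty weight are illustrative, not from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class PiSigma:
    """Sketch of a Pi-sigma unit: k weighted sums of the input are
    multiplied together, then squashed by a sigmoid.  Trained by
    per-sample gradient descent on the entropy error with an assumed
    L1 penalty (the abstract says only 'L regularization')."""

    def __init__(self, n_in, k, lr=0.05, lam=1e-5, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.5, size=(k, n_in + 1))  # +1 bias column
        self.lr, self.lam = lr, lam

    def forward(self, x):
        xb = np.append(x, 1.0)       # input with bias term
        s = self.W @ xb              # sigma layer: k linear sums
        y = sigmoid(np.prod(s))      # pi unit, then sigmoid output
        return xb, s, y

    def step(self, x, t):
        """One gradient step on one sample; returns the entropy error."""
        xb, s, y = self.forward(x)
        p = np.prod(s)
        g = y - t                    # dE/dp for entropy error + sigmoid
        for k in range(self.W.shape[0]):
            # dp/dW[k] = (product of the other sums) * input
            others = p / s[k] if s[k] != 0 else np.prod(np.delete(s, k))
            grad = g * others * xb + self.lam * np.sign(self.W[k])
            self.W[k] -= self.lr * grad
        eps = 1e-12                  # guard the logs
        return float(-(t * np.log(y + eps) + (1 - t) * np.log(1 - y + eps)))
```

Repeated calls to `step` on a sample should drive the entropy error down, which is the monotonicity property the paper establishes for its (batch) version of the algorithm.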

Original language: English
Pages (from-to): 4405-4416
Number of pages: 12
Journal: International Journal of Machine Learning and Cybernetics
Volume: 14
Issue number: 12
Early online date: 12 Jul 2023
DOIs
Publication status: Published - Dec 2023

Fingerprint

Research topics of 'Convergence analysis for sparse Pi-sigma neural network model with entropy error function'.
