This paper addresses the adaptation of radial basis function neural networks (RBF NNs). A new supervised training algorithm for RBF NNs is proposed that retains the distinctive properties of Lyapunov theory-based adaptive filtering (LAF). Unlike many RBF NN training methods, which rely on gradient search, the proposed method first defines a Lyapunov function of the error between the desired output and the RBF NN output, and then designs the adaptation law so that the network output converges asymptotically to the desired output in the Lyapunov sense. The error convergence analysis shows that the design of the new training algorithm is independent of the statistical properties of the input and output signals. The new adaptation law also achieves better tracking capability than the original LAF. The performance of the proposed technique is illustrated through the adaptive prediction of nonlinear and nonstationary speech signals.
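The general idea can be sketched in code. The following is a minimal illustrative example, not the paper's exact algorithm: an RBF network with fixed Gaussian centers whose output-layer weights are updated by a Lyapunov-style normalized adaptation law of the LAF family, in which the gain is chosen so that the a posteriori error contracts by a factor `kappa` at each step regardless of the signal statistics. All names, the test signal, the center placement, and the contraction rate are assumptions made for the sketch.

```python
import numpy as np

def rbf_features(x, centers, width):
    # Gaussian RBF hidden-layer outputs for input vector x
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(0)
n = 500
t = np.arange(n)
# synthetic nonlinear test signal standing in for a speech frame
s = np.sin(0.05 * t) + 0.3 * np.sin(0.17 * t + 1.0)

order = 4                       # predictor input: the last 4 samples
centers = rng.uniform(-1.5, 1.5, size=(12, order))
width = 2.0                     # broad Gaussians so some unit is always active
w = np.zeros(12)                # output-layer weights, adapted online
kappa = 0.9                     # 0 < kappa < 1: per-step error contraction rate
e_prev = 1.0                    # previous a posteriori error (initial guess)
errors = []
for k in range(order, n):
    x = s[k - order:k]
    phi = rbf_features(x, centers, width)
    alpha = s[k] - phi @ w      # a priori prediction error
    norm2 = phi @ phi + 1e-8    # regularized to avoid division by zero
    if abs(alpha) > 1e-12:
        # Lyapunov-style gain: drives |e(k)| to approximately kappa*|e(k-1)|
        g = (phi / norm2) * (1.0 - kappa * abs(e_prev) / abs(alpha))
        w = w + g * alpha       # weight update along the feature direction
    e_prev = s[k] - phi @ w     # a posteriori error after the update
    errors.append(abs(e_prev))
```

Because the gain normalizes by the feature energy and scales with the ratio of the previous error to the a priori error, the a posteriori error decays geometrically, which is the kind of statistics-independent convergence guarantee the abstract refers to.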