Abstract:
To address overfitting when modeling under noisy conditions, a novel cost function and a corresponding training algorithm, based on sampling theory, are proposed for wavelet networks. Because the algorithm uses sample distributions to train the input weights and prediction errors to train the output weights, the learning efficiency of wavelet networks is greatly improved. Theoretical analysis and experiments show that the proposed cost function ensures the generalization ability of wavelet networks; moreover, the new algorithm converges globally and is robust to varying noise.