Title: Novel fast learning algorithms for time-delay neural networks
Abstract: To address the long training time required by Waibel's time-delay neural networks (TDNN) for phoneme recognition, this paper proposes several improved fast learning methods for TDNN. Combining the unsupervised Oja rule with a backpropagation-like supervised algorithm for the initial training of the TDNN weights effectively increases convergence speed. Improving the error energy function and scaling the weight updates according to the magnitude of the output error further increases training speed. Changing from layer-wise backpropagation to averaging the overlapping parts of the backpropagated error of the first hidden layer along each frame, while gradually increasing the number of training samples, also accelerates convergence. For multi-class phonemic modular TDNNs, we improve the architecture of Waibel's modular networks and obtain an optimal tree-structured modular TDNN that learns faster; its training time is less than that of Waibel's modular TDNNs.
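The abstract's first technique relies on the standard Oja rule for unsupervised pre-training of weights. The sketch below is illustrative only and is not taken from the paper: the toy data, learning rate, and function name are assumptions; only the update rule itself, Δw = η·y·(x − y·w) with y = wᵀx, is the standard Oja formula, which drives w toward the dominant principal component of the inputs.

```python
import numpy as np

def oja_update(w, x, lr=0.01):
    """One Oja-rule step (standard formula; lr is an assumed value).

    y = w . x  (linear neuron output)
    w <- w + lr * y * (x - y * w)
    The -y*w term keeps ||w|| near 1 without explicit normalization.
    """
    y = float(np.dot(w, x))
    return w + lr * y * (x - y * w)

# Illustrative toy data: variance is dominated by the first axis,
# so the rule should align w with that axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4)) * np.array([3.0, 1.0, 0.5, 0.1])

w = rng.normal(size=4)
w /= np.linalg.norm(w)
for x in X:
    w = oja_update(w, x)

# After training, w is approximately unit-norm and points along
# the dominant principal direction (here, the first coordinate axis).
print(np.round(w, 2))
```

In the paper's setting, weights initialized this way would then be refined with the supervised backpropagation-like algorithm, which is what the abstract credits with the faster convergence.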
Publication Year: 2003
Publication Date: 2003-01-22
Language: en
Type: article
Indexed In: Crossref