Title: Application of l1 Estimation of Gaussian Mixture Model Parameters for Language Identification
Abstract: In this paper we explore the use of l1 optimization for parameter estimation of Gaussian mixture models (GMM) applied to language identification. To train the Universal Background Model (UBM), the estimation of the GMM means at each step of the Expectation-Maximization (EM) algorithm is stated as an l1 optimization problem, solved by Iteratively Reweighted Least Squares (IRLS). We also present the corresponding solution for Maximum A Posteriori (MAP) adaptation. Results for this UBM-MAP system combined with a Support Vector Machine (SVM) are reported on the LDC and GlobalPhone datasets.
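The core idea named in the abstract, IRLS for an l1 estimate, can be illustrated on the simplest case: finding the l1-optimal center of a set of samples. Each IRLS step solves a weighted l2 problem with weights inversely proportional to the current residual magnitudes, which drives the solution toward the sample median. This is a minimal illustrative sketch, not the paper's actual EM/M-step update; the function name and parameters are assumptions for the example.

```python
import numpy as np

def irls_l1_center(x, n_iter=100, eps=1e-8):
    """Estimate the l1-optimal center of samples x via IRLS.

    Minimizing sum_i |x_i - mu| by iteratively reweighted least
    squares: each step solves a weighted l2 problem with weights
    1 / |x_i - mu|. For scalar data this converges to the sample
    median, the l1 analogue of the mean used in the l2 case.
    (Illustrative sketch only; the paper embeds this scheme in the
    EM algorithm for the GMM mean updates.)
    """
    mu = x.mean()  # l2 (ordinary mean) initialization
    for _ in range(n_iter):
        # IRLS weights; eps guards against division by zero
        w = 1.0 / np.maximum(np.abs(x - mu), eps)
        # Closed-form weighted least-squares solve for a scalar center
        mu = np.sum(w * x) / np.sum(w)
    return mu

samples = np.array([1.0, 2.0, 3.0, 4.0, 100.0])  # one gross outlier
print(samples.mean())            # the l2 estimate is pulled to 22.0
print(irls_l1_center(samples))   # the l1 estimate stays near the median, 3
```

The contrast between the two printed values shows why an l1 criterion can be attractive for GMM means: the estimate is far less sensitive to outlying frames than the usual l2 (mean) update.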
Publication Year: 2013
Publication Date: 2013-01-01
Language: en
Type: book-chapter
Indexed In: ['crossref']
Access and Citation
Cited By Count: 1