Title: Stochastic Meta Descent in online kernel methods
Abstract: A learning system approximates an underlying function from a finite set of observed data. Since batch learning handles large data sets poorly, online learning has been proposed to avoid the computational expense. An iterative method called Stochastic Gradient Descent (SGD) is applied to solve for the underlying function in reproducing kernel Hilbert spaces (RKHSs). To use SGD in a time-varying environment, the learning rate is adjusted by Stochastic Meta-Descent (SMD). Simulation results show that SMD can track shifting and switching target functions, while the model size can be restricted using a sparse solution.
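The abstract's core idea, SGD with an SMD-adapted learning rate, can be illustrated with a minimal scalar sketch. This is an assumption-laden toy (quadratic loss, identity Hessian, synthetic noise), not the paper's kernel-space formulation; the function name, hyperparameters, and target value are all hypothetical:

```python
import random

def smd_scalar(target=3.0, steps=200, eta=0.1, mu=0.05, lam=0.99, seed=0):
    """Toy scalar SMD sketch (illustrative only, not the paper's kernel method).

    Tracks w toward `target` under quadratic loss L = 0.5 * (w - target)^2,
    adapting the step size eta multiplicatively via Stochastic Meta-Descent.
    """
    rng = random.Random(seed)
    w, v = 0.0, 0.0  # parameter and gradient trace v ~ dw/d(ln eta)
    for _ in range(steps):
        g = w - target + rng.gauss(0.0, 0.1)  # noisy gradient of the loss
        Hv = v                                # Hessian-vector product (H = 1 here)
        # meta-update of the learning rate, clipped below at 1/2
        eta *= max(0.5, 1.0 - mu * g * v)
        w -= eta * g                          # ordinary SGD step
        v = lam * v - eta * (g + lam * Hv)    # update the gradient trace
    return w, eta
```

Because eta itself adapts from observed gradients, the same loop can speed up after a target shift, which is the behaviour the simulations in the paper examine in the kernel setting.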
Publication Year: 2009
Publication Date: 2009-05-01
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 4