Monday, February 17, 2014

Kernel LMS algorithm with forward-backward splitting for dictionary learning - implementation -

If you recall the Random Kitchen Sinks, you'll remember that one chooses some of the coefficients at random and then fits the remaining coefficients with a least-squares approach (a sketch of this appears below). The functionals built from these random numbers are part of what is called a (nonlinear) dictionary. Here is the state of the art on approximating time series with RKHSs using nonrandom "frequencies". Maybe one should look into random dictionaries:

Nonlinear adaptive filtering with kernels has become a topic of high interest over the last decade. A characteristic of kernel-based techniques is that [...] a filter parameter update stage. It is surprising to note that most existing strategies for dictionary update can only incorporate new elements into the dictionary. This unfortunately means that they cannot discard obsolete kernel functions, within the context of a time-varying environment in particular. Recently, to remedy this drawback, it has been proposed to associate an ℓ1-norm regularization criterion with the mean-square error criterion. The aim of this paper is to provide theoretical results on the convergence of this approach.
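To make the Random Kitchen Sinks remark above concrete, here is a minimal sketch of the construction: the "frequencies" of the random features are drawn once and never optimized, and only the outer linear coefficients are fitted by least squares. The Gaussian-kernel bandwidth, the number of features, and the toy time series are illustrative assumptions, not anything taken from the paper.

```python
# Random Kitchen Sinks sketch: random Fourier features + least squares.
# Bandwidth, feature count, and toy data are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy time series regression data: y = f(x) + noise
x = np.linspace(0, 4 * np.pi, 400)[:, None]
y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(400)

D = 100          # number of random features (assumed)
sigma = 1.0      # bandwidth of the implicit Gaussian kernel (assumed)

# Random "frequencies" W and phases b: sampled once, never optimized.
W = rng.standard_normal((x.shape[1], D)) / sigma
b = rng.uniform(0, 2 * np.pi, D)

def features(X):
    """Random Fourier features approximating a Gaussian kernel."""
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Only the outer linear coefficients are estimated, via least squares.
Z = features(x)
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)

y_hat = features(x) @ coef   # predictions on the training grid
print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

Increasing D tightens the kernel approximation at the cost of a larger least-squares problem; the appeal is that the nonlinear part of the model costs nothing to fit.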

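In the same spirit, here is a hedged sketch of what a kernel LMS update with forward-backward splitting could look like, assuming a Gaussian kernel, a coherence test for adding dictionary atoms, and soft-thresholding as the ℓ1 proximal step. The step size, regularization weight, and coherence threshold are made-up values, and this is not the authors' exact algorithm.

```python
# Hedged sketch of kernel LMS with forward-backward splitting:
# an LMS (gradient) step on the mean-square error, then an l1 proximal
# step (soft-thresholding) that can zero out, and hence discard,
# obsolete dictionary elements. All parameter values are assumptions.
import numpy as np

def gauss_kernel(u, v, sigma=0.5):
    return np.exp(-np.sum((u - v) ** 2) / (2 * sigma**2))

def klms_fobos(stream, eta=0.1, lam=1e-3, mu0=0.9):
    """stream yields (x_n, d_n) pairs; returns dictionary and weights."""
    D, alpha = [], np.empty(0)           # dictionary atoms, coefficients
    for x, d in stream:
        if not D:                        # (re)initialize on first sample
            D, alpha = [x], np.array([eta * d])
            continue
        k = np.array([gauss_kernel(x, c) for c in D])
        e = d - alpha @ k                # a priori estimation error
        # Forward step: LMS gradient update of the coefficients.
        alpha = alpha + eta * e * k
        # Dictionary growth: add x if it is not too coherent with D.
        if np.max(k) <= mu0:
            D.append(x)
            alpha = np.append(alpha, eta * e)
        # Backward step: soft-thresholding, the prox of lam * ||alpha||_1.
        alpha = np.sign(alpha) * np.maximum(np.abs(alpha) - eta * lam, 0.0)
        # Discard atoms whose coefficient was driven exactly to zero.
        keep = alpha != 0.0
        D = [c for c, kept in zip(D, keep) if kept]
        alpha = alpha[keep]
    return D, alpha

# Usage on a toy nonlinear stream:
rng = np.random.default_rng(1)
xs = rng.uniform(-1, 1, (500, 1))
stream = ((x, np.sin(3 * x[0]) + 0.05 * rng.standard_normal()) for x in xs)
D, alpha = klms_fobos(stream)
print(len(D), "atoms retained")
```

The point of the backward (proximal) step is that it drives small coefficients exactly to zero, which is what lets the dictionary shrink as well as grow, the very ability the abstract notes is missing from most existing dictionary update strategies.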

The Matlab implementation is on Cedric's page.

