On the algorithmics and applications of a mixed-norm based kernel learning formulation
J. Saketha Nath, G. Dinesh, S. Raman, C. Bhattacharyya, A. Ben-Tal, K.R. Ramakrishnan
Published in Neural Information Processing Systems
2009
Pages: 844–852
Abstract
Motivated by real-world problems such as object categorization, we study a particular mixed-norm regularization for Multiple Kernel Learning (MKL). It is assumed that the given set of kernels is grouped into distinct components, each of which is crucial for the learning task at hand. The formulation hence employs ℓ∞ regularization to promote combinations at the component level and ℓ1 regularization to promote sparsity among the kernels within each component. While previous attempts have formulated this as a non-convex problem, the formulation given here is an instance of a non-smooth convex optimization problem, which admits an efficient Mirror-Descent (MD) based procedure. The MD procedure optimizes over a product of simplices, a case not well studied in the literature. Results on real-world datasets show that the new MKL formulation is well suited for object categorization tasks and that the MD-based algorithm outperforms state-of-the-art MKL solvers such as simpleMKL in terms of computational effort.
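For intuition about the key algorithmic ingredient, the sketch below shows mirror descent over a product of simplices with the standard entropic mirror map, under which each step reduces to a blockwise exponentiated-gradient update followed by per-block renormalization. This is a minimal illustrative sketch, not the authors' solver: the names (mirror_descent_product_simplex, grad_fn, blocks), the eta/sqrt(t) step-size schedule, and the iterate averaging are all assumptions, and the MKL objective and its subgradient are abstracted behind grad_fn.

```python
import numpy as np

def mirror_descent_product_simplex(grad_fn, blocks, steps=200, eta=0.5):
    """Entropic mirror descent over a product of probability simplices.

    With the entropy mirror map, each MD step reduces to a blockwise
    exponentiated-gradient update followed by renormalization, so every
    block stays on its own simplex by construction.
    """
    dim = sum(len(b) for b in blocks)
    x = np.zeros(dim)
    for b in blocks:
        x[b] = 1.0 / len(b)          # start at the uniform distribution

    x_avg = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad_fn(x)               # (sub)gradient of the objective at x
        for b in blocks:
            x[b] *= np.exp(-(eta / np.sqrt(t)) * g[b])
            x[b] /= x[b].sum()       # renormalize onto the block's simplex
        x_avg += x
    return x_avg / steps             # averaged iterate, standard for non-smooth MD

# Toy usage: minimize a linear objective <c, x> over two 3-dimensional
# simplices; the minimizer concentrates mass on the cheapest coordinate
# of each block.
c = np.array([3.0, 1.0, 2.0, 0.5, 4.0, 1.5])
blocks = [np.arange(0, 3), np.arange(3, 6)]
x_star = mirror_descent_product_simplex(lambda x: c, blocks)
print(x_star.round(3))
```

The appeal of the entropic mirror map here is that the simplex constraints are handled in closed form by the multiplicative update itself, with no projection subroutine needed, which is what makes MD attractive for this geometry compared with generic projected-gradient solvers.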
About the journal
Journal: Advances in Neural Information Processing Systems 22 - Proceedings of the 2009 Conference
Publisher: Neural Information Processing Systems
Open Access: No