Generalized hierarchical kernel learning
P. Jawanpuria, J. Saketha Nath, G. Ramakrishnan
Published in Journal of Machine Learning Research
2015
Volume: 16
Pages: 617 - 652
Abstract
This paper generalizes the framework of Hierarchical Kernel Learning (HKL) and illustrates its utility in the domain of rule learning. HKL involves Multiple Kernel Learning over a given set of base kernels assumed to be embedded on a directed acyclic graph. This paper proposes a two-fold generalization of HKL: the first is the use of a generic ℓ1/ℓp block-norm regularizer (p ∈ (1, 2]) that alleviates a key limitation of the HKL formulation; the second is an extension to multi-class, multi-label and, more generally, multi-task applications. The main technical contribution of this work is the derivation of a highly specialized partial dual of the proposed generalized HKL formulation and an efficient mirror descent based active set algorithm for solving it. Importantly, the generic regularizer enables the proposed formulation to be employed for Rule Ensemble Learning (REL), where the goal is to construct an ensemble of conjunctive propositional rules. Experiments on benchmark REL data sets illustrate the efficacy of the proposed generalizations. ©2015 Pratik Jawanpuria, Jagarlapudi Saketha Nath and Ganesh Ramakrishnan.
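The ℓ1/ℓp block-norm regularizer mentioned above groups the base kernels by the descendant sets of nodes in the directed acyclic graph. The sketch below is a minimal illustration of how such a regularizer can be evaluated for p ∈ (1, 2]; it assumes the standard HKL descendant-set grouping, and the names `descendants`, `weights` and `component_norms` are hypothetical, not taken from the paper's implementation.

```python
# Illustrative sketch (not the authors' code): an l1/lp block-norm over
# kernel groups defined by DAG descendant sets, as in hierarchical kernel
# learning with p in (1, 2].
import numpy as np

def l1_lp_block_norm(component_norms, descendants, weights, p=1.5):
    """Sum over DAG nodes v of weights[v] times the lp-norm of the
    per-kernel component norms restricted to the descendant set D(v)."""
    total = 0.0
    for v, D_v in descendants.items():
        block = np.array([component_norms[w] for w in D_v])
        total += weights[v] * np.sum(block ** p) ** (1.0 / p)
    return total

# Toy DAG: root 0 with children 1 and 2; each descendant set includes the node itself.
descendants = {0: [0, 1, 2], 1: [1], 2: [2]}
weights = {0: 1.0, 1: 1.0, 2: 1.0}
component_norms = {0: 0.5, 1: 0.2, 2: 0.0}  # e.g. RKHS norms of each kernel's component
print(l1_lp_block_norm(component_norms, descendants, weights, p=1.5))
```

With p close to 1 the inner norm approaches a plain sum, while p = 2 recovers the ℓ1/ℓ2 block-norm of the original HKL setting; this range is what lets the formulation produce the sparse rule ensembles targeted in REL.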
About the journal
Journal: Journal of Machine Learning Research
Publisher: Microtome Publishing
ISSN: 1532-4435