Meta-consolidation for continual learning
Published in Advances in Neural Information Processing Systems
2020
Volume: 2020-December
Abstract
The ability to continuously learn and adapt to new tasks, without losing grasp of already acquired knowledge, is a hallmark of biological learning systems, which current deep learning systems fall short of. In this work, we present a novel methodology for continual learning called MERLIN: Meta-Consolidation for Continual Learning. We assume that the weights of a neural network θ, for solving task t, come from a meta-distribution p(θ|t). This meta-distribution is learned and consolidated incrementally. We operate in the challenging online continual learning setting, where a data point is seen by the model only once. Our experiments with continual learning benchmarks on the MNIST, CIFAR-10, CIFAR-100 and Mini-ImageNet datasets show consistent improvement over five baselines, including a recent state-of-the-art method, corroborating the promise of MERLIN.
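The core idea — treating task-specific weights θ as samples from a learned, incrementally consolidated meta-distribution p(θ|t) — can be illustrated with a minimal sketch. This is not the paper's actual method (which learns the meta-distribution with a generative model); it is only a toy stand-in using a per-task diagonal Gaussian over flattened weights, with all class and function names hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

class TaskWeightPrior:
    """Toy per-task Gaussian stand-in for the meta-distribution p(theta|t).

    Illustrative only: MERLIN's actual meta-distribution is learned
    differently; here we just keep a running mean and variance over
    flattened weight vectors obtained from training runs on task t.
    """

    def __init__(self, dim):
        self.mu = np.zeros(dim)        # running mean of weight samples
        self.var = np.full(dim, 1e-8)  # running variance (floored for stability)
        self.count = 0                 # number of consolidation updates seen

    def consolidate(self, theta):
        """Incrementally fold a new weight sample into the meta-distribution."""
        self.count += 1
        delta = theta - self.mu
        self.mu += delta / self.count
        # Welford-style running variance update (illustrative)
        self.var = np.maximum(
            self.var + (delta * (theta - self.mu) - self.var) / self.count,
            1e-8,
        )

    def sample(self):
        """Draw one plausible weight vector theta ~ p(theta|t)."""
        return self.mu + np.sqrt(self.var) * rng.standard_normal(self.mu.shape)

# usage: consolidate weights from two training runs on a task, then sample
prior = TaskWeightPrior(dim=4)
prior.consolidate(np.array([1.0, 2.0, 3.0, 4.0]))
prior.consolidate(np.array([1.2, 1.8, 3.1, 3.9]))
theta_t = prior.sample()  # a sampled weight vector for this task
```

Sampling weights at inference time, rather than keeping a single fixed network, is what allows the meta-distribution to be updated as new tasks arrive without directly overwriting previously learned solutions.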
About the journal
Journal: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
ISSN: 1049-5258