Unsupervised Universal Attribute Modeling for Action Recognition
Published in IEEE Transactions on Multimedia
2019
Volume: 21
Issue: 7
Pages: 1672-1680
Abstract
Fixed-dimensional representations for action clips of varying lengths have been proposed in the literature using aggregation models such as bag-of-words and Fisher vectors. These representations are high-dimensional and require classification techniques for action recognition. In this paper, we propose a framework for the unsupervised extraction of a discriminative low-dimensional representation called the action-vector. First, local spatio-temporal features are used to capture action attributes implicitly in a large Gaussian mixture model called the universal attribute model (UAM). To enhance the contribution of the significant attributes in each action clip, a maximum a posteriori (MAP) adaptation of the UAM means is performed for each clip. This results in a concatenated mean vector called the super action vector (SAV) for each action clip. However, the SAV is still high-dimensional because of the presence of redundant attributes. Hence, we employ factor analysis to represent every SAV in terms of only the few important attributes contributing to the action clip. This leads to a low-dimensional representation called the action-vector. The entire procedure requires no class labels and produces action-vectors that are distinct representations of each action, irrespective of the inter-actor variability encountered in unconstrained videos. An evaluation on the trimmed action datasets UCF101 and HMDB51 demonstrates the efficacy of action-vectors for action classification over state-of-the-art techniques. Moreover, we show that action-vectors can adequately represent untrimmed videos from the THUMOS14 dataset and produce classification results comparable to existing techniques.
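The pipeline described above (large unsupervised GMM, MAP adaptation of its means, concatenation into a supervector, factor analysis down to a compact vector) mirrors the GMM-supervector and i-vector approach from speaker recognition. The sketch below illustrates that flow using scikit-learn stand-ins (GaussianMixture for the UAM, FactorAnalysis for the final projection); the feature dimension, component count K, relevance factor r, and the random stand-in data are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of the UAM -> SAV -> action-vector pipeline.
# All data and hyperparameters here are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Stand-in for local spatio-temporal descriptors pooled over many clips.
background_feats = rng.standard_normal((5000, 64))

# 1. Universal attribute model: a large GMM trained without labels.
K = 16  # number of attributes (mixture components); illustrative
uam = GaussianMixture(n_components=K, covariance_type="diag", random_state=0)
uam.fit(background_feats)

def super_action_vector(clip_feats, uam, r=16.0):
    """MAP-adapt the UAM means to one clip and concatenate them (SAV)."""
    post = uam.predict_proba(clip_feats)           # responsibilities, (T, K)
    n_k = post.sum(axis=0)                         # soft counts per component
    # First-order statistics, normalized by the soft counts.
    f_k = post.T @ clip_feats / np.maximum(n_k, 1e-8)[:, None]
    alpha = (n_k / (n_k + r))[:, None]             # relevance-MAP coefficients
    adapted_means = alpha * f_k + (1.0 - alpha) * uam.means_
    return adapted_means.ravel()                   # (K * d,) supervector

# 2. Build SAVs for a collection of clips (random stand-ins here).
savs = np.stack([
    super_action_vector(rng.standard_normal((200, 64)), uam)
    for _ in range(100)
])

# 3. Factor analysis compresses the redundant SAV into an action-vector.
fa = FactorAnalysis(n_components=32, random_state=0)
action_vectors = fa.fit_transform(savs)
print(action_vectors.shape)  # (100, 32): one low-dimensional action-vector per clip
```

Note that no class labels appear anywhere in the sketch: both the GMM and the factor-analysis projection are fit unsupervised, which is the property the abstract emphasizes.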
About the journal
Journal: IEEE Transactions on Multimedia
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISSN: 1520-9210