Action-02MCF: A robust space-time correlation filter for action recognition in clutter and adverse lighting conditions

Anwaar Ul-Haq, Xiaoxia Yin, Yunchan Zhang, Iqbal Gondal

Research output: Chapter in book / Published conference paper (peer-reviewed)

2 Citations (Scopus)


Human actions are spatio-temporal visual events, and recognizing them under varying conditions remains a challenging computer vision problem. In this paper, we introduce a robust, feature-based space-time correlation filter, called Action-02MCF ('0' for zero-aliasing, '2M' for maximum margin), for recognizing human actions in video sequences. This filter combines (i) the sparsity of the spatio-temporal feature space, (ii) the generalization of the maximum-margin criterion, (iii) the aliasing-free localization performance of correlation filtering, and (iv) the rich context of maximally stable space-time interest points into a single classifier. Its multi-objective formulation delivers robustness, generalization, and recognition as a single package. Action-02MCF can simultaneously localize and classify actions of interest even in clutter and under adverse imaging conditions. We evaluate the proposed filter on challenging human action datasets. Experimental results verify its performance compared to other correlation-filtering-based action recognition approaches.
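The core idea of localizing a target by correlation filtering can be illustrated with a minimal sketch. Note this is not the Action-02MCF formulation itself (which adds sparsity, maximum-margin, and zero-aliasing terms over spatio-temporal features); it is a basic frequency-domain correlation filter of the MOSSE style, with all function names and the regularizer `eps` being illustrative assumptions:

```python
import numpy as np

def train_corr_filter(templates, target, eps=1e-3):
    """Fit a frequency-domain correlation filter.

    templates: list of 2D training patches.
    target: desired correlation response (e.g. a Gaussian peak
            centred on the object location).
    eps: small regularizer to avoid division by zero (assumed value).
    """
    G = np.fft.fft2(target)
    A = np.zeros_like(G)
    B = np.zeros_like(G)
    for t in templates:
        F = np.fft.fft2(t)
        A += G * np.conj(F)   # cross-correlation numerator
        B += F * np.conj(F)   # energy spectrum denominator
    return A / (B + eps)

def localize(H, frame):
    """Apply the filter and return the (row, col) of the response peak."""
    resp = np.real(np.fft.ifft2(H * np.fft.fft2(frame)))
    return np.unravel_index(np.argmax(resp), resp.shape)
```

Because the correlation is computed in the Fourier domain, a spatial shift of the input produces the same shift in the response peak, which is what lets a single filtering pass localize the target. A full space-time filter extends the same construction to 3D video volumes via `np.fft.fftn`.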
Original language: English
Title of host publication: Advanced concepts for intelligent vision systems
Editors: Jacques Blanc-Talon, Cosimo Distante, Wilfried Philips, Dan Popescu, Paul Scheunders
Place of publication: Cham, Switzerland
Number of pages: 12
ISBN (Electronic): 9783319486802
ISBN (Print): 9783319486796
Publication status: Published - 2016

Publication series

Name: Lecture notes in computer science
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


