Effective and efficient human action recognition using dynamic frame skipping and trajectory rejection

Cited 23 times in Web of Science · Cited 0 times in Scopus
DC Field: Value (Language)
dc.contributor.author: Seo, Jeong-Jik (ko)
dc.contributor.author: Kim, Hyung-Il (ko)
dc.contributor.author: De Neve, Wesley (ko)
dc.contributor.author: Ro, Yong Man (ko)
dc.date.accessioned: 2017-04-17T07:41:18Z
dc.date.available: 2017-04-17T07:41:18Z
dc.date.created: 2016-06-21
dc.date.issued: 2017-02
dc.identifier.citation: IMAGE AND VISION COMPUTING, v.58, pp.76 - 85
dc.identifier.issn: 0262-8856
dc.identifier.uri: http://hdl.handle.net/10203/223329
dc.description.abstract: Human action recognition (HAR) is a core technology for human-computer interaction and video understanding, attracting significant research and development attention in the field of computer vision. However, in uncontrolled environments, achieving effective HAR is still challenging, due to the widely varying nature of video content. In previous research efforts, trajectory-based video representations have been widely used for HAR. Although these approaches show state-of-the-art HAR performance on various datasets, issues such as high computational complexity and the presence of redundant trajectories still need to be addressed in order to solve the problem of real-world HAR. In this paper, we propose a novel method for HAR, integrating a technique for rejecting redundant trajectories that mainly originate from camera movement, without degrading the effectiveness of HAR. Furthermore, in order to facilitate efficient optical flow estimation prior to trajectory extraction, we integrate a technique for dynamic frame skipping. As a result, we only make use of a small subset of the frames present in a video clip for optical flow estimation. Comparative experiments with five publicly available human action datasets show that the proposed method outperforms state-of-the-art HAR approaches in terms of effectiveness, while simultaneously mitigating the computational complexity. (C) 2016 Elsevier B.V. All rights reserved.
dc.language: English
dc.publisher: ELSEVIER SCIENCE BV
dc.title: Effective and efficient human action recognition using dynamic frame skipping and trajectory rejection
dc.type: Article
dc.identifier.wosid: 000395844700008
dc.identifier.scopusid: 2-s2.0-85003601790
dc.type.rims: ART
dc.citation.volume: 58
dc.citation.beginningpage: 76
dc.citation.endingpage: 85
dc.citation.publicationname: IMAGE AND VISION COMPUTING
dc.identifier.doi: 10.1016/j.imavis.2016.06.002
dc.contributor.localauthor: Ro, Yong Man
dc.contributor.nonIdAuthor: Seo, Jeong-Jik
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: Frame skipping
dc.subject.keywordAuthor: Human action recognition (HAR)
dc.subject.keywordAuthor: Motion descriptor
dc.subject.keywordAuthor: Motion trajectory
dc.subject.keywordAuthor: Optical flow
dc.subject.keywordPlus: REPRESENTATION
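
The abstract above describes dynamic frame skipping: expensive optical flow is estimated only on a subset of frames, with the skip step adapted to how much motion the video exhibits. The following is a minimal Python sketch of that general idea only; the `motion` proxy, the `low`/`high` thresholds, and the skip policy are all illustrative assumptions, not the criterion or parameters used in the paper.

```python
def select_frames(motion, low=0.1, high=0.5, max_skip=4):
    """Dynamically choose which frames to process with optical flow.

    motion[i] is a cheap proxy for the motion between frame i and
    frame i+1 (e.g. mean absolute pixel difference). All thresholds
    and the skip-adjustment rule below are illustrative assumptions.
    """
    selected = [0]      # always process the first frame
    skip = 1            # current skip step
    i = 1
    n = len(motion) + 1  # one motion entry per consecutive frame pair
    while i < n:
        selected.append(i)
        if motion[i - 1] < low:
            # little motion: widen the skip, up to max_skip
            skip = min(skip + 1, max_skip)
        elif motion[i - 1] > high:
            # strong motion: fall back to processing every frame
            skip = 1
        i += skip
    return selected
```

For a static scene (`motion` near zero) the skip step grows and only a few frames are selected, while for a high-motion clip every frame is kept, which is the efficiency/effectiveness trade-off the abstract refers to.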
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
