Accurate Single-Stream Action Detection in Real-Time

Title: Accurate Single-Stream Action Detection in Real-Time
Publication Type: Conference Paper
Year of Publication: 2019
Authors: Liu Y, Yang F, Ginhac D
Conference Name: ICDSC 2019: 13th International Conference on Distributed Smart Cameras
Publisher: ACM, 1515 Broadway, New York, NY 10036-9998 USA
Conference Location: Univ Trento, Dept Informat Engn & Comp Sci
ISBN Number: 978-1-4503-7189-6
Keywords: action detection, convolutional neural network, embedded system, video analytics
Abstract

Analyzing videos of human actions involves understanding the spatial and temporal context of the scenes. State-of-the-art action detection approaches have demonstrated impressive results using Convolutional Neural Networks (CNNs) within a two-stream framework. However, most of them operate in a non-real-time, offline fashion and are thus ill-suited to many emerging real-world scenarios such as autonomous driving and public surveillance. In addition, they are too computationally demanding to deploy on devices with limited power resources (e.g., embedded systems). To address the above challenges, we propose an efficient single-stream action detection framework that exploits temporal coherence between successive video frames. This allows CNN appearance features to be cheaply propagated by motion rather than being extracted from every frame. Furthermore, we utilize an implicit motion representation to amplify appearance features. Our method, based on motion-guided and motion-aware appearance features, is evaluated on the UCF-101-24 dataset. Experiments indicate that the proposed method achieves real-time action detection at up to 32 fps with accuracy comparable to the two-stream approach.
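The abstract describes propagating CNN appearance features from a key frame to subsequent frames via motion, rather than re-running the backbone on every frame. The following is a minimal illustrative sketch of that general idea (bilinear warping of a feature map along a motion field), not the authors' implementation; the function name `warp_features` and the tensor shapes are assumptions:

```python
import numpy as np

def warp_features(feat, flow):
    """Propagate a key-frame feature map to the current frame by
    sampling it along a motion field, instead of recomputing it.

    feat: (C, H, W) key-frame feature map.
    flow: (2, H, W) motion from key frame to current frame, in
          feature-map pixels; flow[0] = dx, flow[1] = dy.
    Returns a (C, H, W) warped feature map (bilinear sampling,
    border positions clamped).
    """
    C, H, W = feat.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    # For each target position, look back along the flow to find the
    # source position in the key frame.
    sx = np.clip(xs - flow[0], 0, W - 1)
    sy = np.clip(ys - flow[1], 0, H - 1)
    x0 = np.floor(sx).astype(int); x1 = np.clip(x0 + 1, 0, W - 1)
    y0 = np.floor(sy).astype(int); y1 = np.clip(y0 + 1, 0, H - 1)
    wx, wy = sx - x0, sy - y0
    # Bilinear interpolation of the four neighbouring feature vectors.
    top = feat[:, y0, x0] * (1 - wx) + feat[:, y0, x1] * wx
    bot = feat[:, y1, x0] * (1 - wx) + feat[:, y1, x1] * wx
    return top * (1 - wy) + bot * wy
```

Since warping a feature map costs far less than a forward pass through the backbone, this is the kind of operation that lets per-frame appearance extraction be skipped between key frames.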

DOI: 10.1145/3349801.3349821