Towards crowd density-aware video surveillance applications

Fradi, Hajer; Dugelay, Jean-Luc
Information Fusion, September 2014, ISSN: 1566-2535

Crowd density analysis is a crucial component of visual surveillance, mainly for security monitoring. This paper proposes a novel approach to crowd density measurement in which local information at the pixel level substitutes for a global crowd level or a per-frame count of people. The proposed approach generates automatic crowd density maps by using local features as observations of a probabilistic density function. It also involves a feature tracking step that excludes feature points belonging to the background. This step benefits the subsequent density estimation because the influence of features irrelevant to the underlying crowd density is removed. Since the proposed crowd density conveys rich information about the local distribution of persons in the scene, we employ it as side information to complement other video surveillance tasks in crowded scenes. First, since conventional detection and tracking methods scale poorly to crowds, we use the proposed crowd density to enhance detection and tracking in videos of high-density crowds. Second, we employ the local density together with regular motion patterns as crowd attributes for high-level applications such as crowd change detection and event recognition. Third, we investigate the concept of crowd context-aware privacy protection by adjusting the obfuscation level according to the crowd density. In the experiments, our proposed approach to crowd density estimation is evaluated on videos from different datasets, and the results demonstrate the effectiveness of feature tracks for crowd measurements. Moreover, the use of crowd density in the other applications yields good performance for detection, tracking, behavior analysis, and privacy preservation.
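The density-map idea described above can be illustrated with a minimal sketch: treating each tracked (foreground) feature point as an observation of a probability density function and accumulating an isotropic Gaussian kernel per point. This is a generic kernel density estimate, not the paper's exact formulation; the function name, bandwidth value, and sample points are illustrative assumptions.

```python
import numpy as np

def crowd_density_map(points, shape, bandwidth=15.0):
    """Build a crowd density map from tracked feature points by
    summing an isotropic Gaussian kernel centred on each point
    (a plain kernel density estimate; illustrative sketch only)."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    density = np.zeros(shape, dtype=float)
    for (px, py) in points:
        d2 = (xs - px) ** 2 + (ys - py) ** 2
        density += np.exp(-d2 / (2.0 * bandwidth ** 2))
    # normalise so each kernel integrates to ~1 person
    density /= 2.0 * np.pi * bandwidth ** 2
    return density

# hypothetical moving feature points; background features would
# already be excluded by the feature tracking step
pts = [(40, 30), (45, 32), (120, 80)]
dmap = crowd_density_map(pts, (100, 160))
```

The resulting map is high where feature points cluster, which is what allows the later applications (density-aware detection, tracking, and obfuscation) to read local crowding directly at each pixel.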

Digital Security
Eurecom Ref:
© Elsevier. Personal use of this material is permitted. The definitive version of this paper was published in Information Fusion, September 2014, ISSN: 1566-2535 and is available at: