Gianfranco Doretto / Publications

Joint recognition of complex events and track matching

Chan, M. T., Hoogs, A., Bhotika, R., Perera, A., Schmiederer, J., and Doretto, G.
Joint recognition of complex events and track matching
In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), vol. 2, pp. 1615–1622, New York City, NY, USA, June 2006.

Download

PDF (958.4 kB)

Abstract

We present a novel method for jointly performing recognition of complex events and linking fragmented tracks into coherent, long-duration tracks. Many event recognition methods require highly accurate tracking, and may fail when tracks corresponding to event actors are fragmented or partially missing. However, these conditions occur frequently due to occlusions, traffic, and tracking errors. Recently, methods have been proposed for linking track fragments from multiple objects under these difficult conditions. Here, we develop a method for solving these two problems jointly. A hypothesized event model, represented as a Dynamic Bayes Net, supplies data-driven constraints on the likelihood of proposed track fragment matches. These event-guided constraints are combined with the appearance and kinematic constraints used in the previous track-linking formulation. The result is the most likely track-linking solution given the event model, and the highest event score given all of the track fragments. The event model with the highest score is determined to have occurred if that score exceeds a threshold. Results on a busy scene of airplane servicing activities, where many non-event movers and long fragmented tracks are present, demonstrate the promise of the approach for solving the joint problem.
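
The toy Python sketch below is not from the paper; it only illustrates, with made-up likelihood values, helper names, and threshold, the joint idea summarized in the abstract: each candidate link between track fragments is scored by combining appearance, kinematic, and event-model log-likelihoods, the highest-scoring one-to-one linking is selected, and the resulting event score is compared against a threshold to decide whether the hypothesized event occurred.

# Hypothetical sketch (not the authors' code): joint track linking and event scoring.
from itertools import permutations

# Toy log-likelihoods for linking "ending" fragment i to "starting" fragment j.
# In the paper these come from appearance/kinematic cues and from the hypothesized
# Dynamic Bayes Net event model; here they are arbitrary illustrative numbers.
appearance = {(0, 0): -0.2, (0, 1): -1.5, (1, 0): -1.2, (1, 1): -0.3}
kinematic  = {(0, 0): -0.4, (0, 1): -2.0, (1, 0): -1.8, (1, 1): -0.5}
event      = {(0, 0): -0.1, (0, 1): -3.0, (1, 0): -2.5, (1, 1): -0.2}

def linking_score(assignment):
    """Combined log-likelihood of one candidate linking (sum over its links)."""
    return sum(appearance[l] + kinematic[l] + event[l] for l in assignment)

def best_linking(n_ending, n_starting):
    """Exhaustively search one-to-one linkings (fine for a toy-sized problem)."""
    best, best_score = None, float("-inf")
    for perm in permutations(range(n_starting), n_ending):
        assignment = list(enumerate(perm))   # link ending fragment i -> starting fragment perm[i]
        score = linking_score(assignment)
        if score > best_score:
            best, best_score = assignment, score
    return best, best_score

if __name__ == "__main__":
    links, score = best_linking(2, 2)
    # Event score under the best linking; the event is declared to have occurred
    # only if this score clears a (hypothetical) threshold.
    event_score = sum(event[l] for l in links)
    THRESHOLD = -1.0
    print("best linking:", links, "combined log-likelihood:", round(score, 2))
    print("event declared:", event_score > THRESHOLD)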

BibTeX

@InProceedings{chanHBPSD06cvpr,
  Title                    = {Joint recognition of complex events and track matching},
  Author                   = {Chan, M. T. and Hoogs, A. and Bhotika, R. and Perera, A. and Schmiederer, J. and Doretto, G.},
  Booktitle                = {Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR)},
  Year                     = {2006},
  Address                  = {New York City, NY, USA},
  Month                    = jun,
  Pages                    = {1615--1622},
  Volume                   = {2},
  Abstract                 = {We present a novel method for jointly performing recognition of complex events and linking fragmented tracks into coherent, long-duration tracks. Many event recognition methods require highly accurate tracking, and may fail when tracks corresponding to event actors are fragmented or partially missing. However, these conditions occur frequently due to occlusions, traffic, and tracking errors. Recently, methods have been proposed for linking track fragments from multiple objects under these difficult conditions. Here, we develop a method for solving these two problems jointly. A hypothesized event model, represented as a Dynamic Bayes Net, supplies data-driven constraints on the likelihood of proposed track fragment matches. These event-guided constraints are combined with the appearance and kinematic constraints used in the previous track-linking formulation. The result is the most likely track-linking solution given the event model, and the highest event score given all of the track fragments. The event model with the highest score is determined to have occurred if that score exceeds a threshold. Results on a busy scene of airplane servicing activities, where many non-event movers and long fragmented tracks are present, demonstrate the promise of the approach for solving the joint problem.},
  Bib2html_pubtype         = {Refereed Conferences},
  Bib2html_rescat          = {Video Surveillance, Event Recognition, Track Matching},
  Doi                      = {10.1109/CVPR.2006.160},
  File                     = {chanHBPSD06cvpr.pdf:doretto\\conference\\chanHBPSD06cvpr.pdf:PDF},
  ISSN                     = {1063-6919},
  Owner                    = {doretto},
  Timestamp                = {2006.11.29}
}