Model-based incident detection system with motion classification

Bibliographic Details
Title: Model-based incident detection system with motion classification
Patent Number: 6,985,172
Publication Date: January 10, 2006
Appl. No: 10/056665
Application Filed: January 25, 2002
Abstract: Surveillance apparatus and methods effective for detecting incidents are based upon improved image processing techniques applied to infrared and visible light spectrum images in a time sequence. A reference image is generated by removing motion objects from an image of a region of interest. The reference image is compared to an image to be analyzed to detect motion. Detected motion is analyzed and classified as motion objects. The motion objects are then compared to a motion model, which classifies the objects as being anticipated or unanticipated.
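The abstract's core pipeline (build a motion-free reference image, then difference each new frame against it) can be sketched as follows. This is an illustrative reading, not the patented implementation; the median is only one of the reference-computation options the claims name (median, minimum, or maximum per-pixel values), and all function names here are hypothetical.

```python
import numpy as np

def reference_image(frames):
    """Estimate a motion-free reference image as the per-pixel median
    of a frame sequence. A pixel covered by a moving object in only a
    minority of frames keeps its background value in the median."""
    return np.median(np.stack(frames), axis=0)

def temporal_difference(frame, reference):
    """Absolute per-pixel difference between the image to be analyzed
    and the reference image (one of the differencing options named in
    the claims)."""
    return np.abs(frame.astype(np.int16) - reference.astype(np.int16)).astype(np.uint8)
```

For example, a bright object that visits each pixel in only a minority of frames is removed entirely from the reference, so it appears at full contrast in the temporal difference image.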
Inventors: Rigney, Michael P. (San Antonio, TX, US); Magee, Michael J. (Rio Medina, TX, US); Franke, Ernest A. (San Antonio, TX, US)
Assignees: Southwest Research Institute (San Antonio, TX, US)
Claim: 1. A method of using video images to monitor incidents in a region of interest, comprising the steps of: computing a reference image from a set of images by removing moving objects from the set of images; storing a motion model representing anticipated motion in the region of interest; acquiring an image to be analyzed; computing a temporal difference image by comparing the image to be analyzed with the reference image; repeating the step of computing a temporal difference image to obtain a set of temporal difference images; calculating, from the set of temporal difference images, at least one temporal difference statistic for each pixel; detecting motion in a temporal difference image by separating motion pixels from background in the temporal difference image, using the temporal difference statistic; grouping motion pixels into motion objects; extracting features from the objects; comparing the features of motion objects to the motion model; wherein the motion model classifies objects as anticipated or not anticipated; and wherein the motion model further classifies anticipated objects as being of interest or not of interest.
Claim: 2. The method of claim 1, wherein the step of computing a reference image is performed by calculating a median pixel value for each pixel in an image sequence, the image sequence comprising at least a first image and a second image, and by removing moving objects from the image sequence by replacing each pixel value in at least one image of the image sequence with the median pixel value.
Claim: 3. The method of claim 1, wherein the step of computing a reference image is performed by determining a minimum pixel value for each pixel in an image sequence, the image sequence comprising at least a first image and a second image, and by removing moving objects from the image sequence by replacing each pixel value in at least one image of the image sequence with the minimum pixel value.
Claim: 4. The method of claim 1, wherein the step of computing a reference image is performed by determining a maximum pixel value for each pixel in an image sequence, the image sequence comprising at least a first image and a second image, and by removing moving objects from the image sequence by replacing each pixel value in at least one image of the image sequence with the maximum pixel value.
Claim: 5. The method of claim 1, wherein the step of computing a temporal difference image is performed by subtracting pixel values in the image to be analyzed from pixel values of the reference image.
Claim: 6. The method of claim 1, wherein the step of computing a temporal difference image is performed by subtracting pixel values in the reference image from pixel values of the image to be analyzed.
Claim: 7. The method of claim 1, wherein the step of computing a temporal difference image is performed by calculating the absolute value of the difference between pixel values in the image to be analyzed and pixel values of the reference image.
Claim: 8. The method of claim 1, wherein the temporal difference statistic is an nth percentile statistic.
Claim: 9. The method of claim 1, wherein the temporal difference statistic is a mean statistic.
Claim: 10. The method of claim 1, wherein the temporal difference statistic is a standard deviation statistic.
Claim: 11. The method of claim 1, wherein the step of detecting motion is performed by using the temporal difference statistic of each pixel to determine a threshold and by comparing the temporal difference image with the threshold pixel values.
Claim: 12. The method of claim 11, wherein the threshold is based on an nth percentile statistic.
Claim: 13. The method of claim 11, wherein the threshold is based on a standard deviation statistic.
Claim: 14. The method of claim 1, wherein the classifying step is performed by extracting spatial features.
Claim: 15. The method of claim 1, wherein the classifying step is performed by extracting temporal features.
Claim: 16. The method of claim 1, further comprising the step of generating a response signal in response to the classifying step.
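The detection and grouping steps of the claims above (a per-pixel threshold derived from temporal difference statistics, as in claims 9–13, followed by grouping motion pixels into motion objects, as in claim 1) might be sketched as below. This is one plausible reading under stated assumptions: the mean-plus-k-standard-deviations threshold and the 4-connected grouping are illustrative choices, and all function names are hypothetical.

```python
import numpy as np
from collections import deque

def motion_mask(diff_stack, current_diff, k=3.0):
    """Threshold each pixel of the current temporal difference image
    at mean + k*std computed over a stack of past difference images
    (an illustrative combination of the mean and standard deviation
    statistics of claims 9, 10, and 13; k is an assumed tuning value)."""
    stack = np.stack(diff_stack).astype(np.float64)
    threshold = stack.mean(axis=0) + k * stack.std(axis=0)
    return current_diff > threshold

def group_motion_objects(mask):
    """Group motion pixels into 4-connected motion objects via
    breadth-first flood fill (one way to realize the grouping step
    of claim 1). Returns a list of pixel-coordinate lists."""
    labels = np.zeros(mask.shape, dtype=int)
    objects = []
    for r0 in range(mask.shape[0]):
        for c0 in range(mask.shape[1]):
            if mask[r0, c0] and labels[r0, c0] == 0:
                objects.append([])
                labels[r0, c0] = len(objects)
                queue = deque([(r0, c0)])
                while queue:
                    r, c = queue.popleft()
                    objects[-1].append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                                and mask[rr, cc] and labels[rr, cc] == 0):
                            labels[rr, cc] = len(objects)
                            queue.append((rr, cc))
    return objects
```

The per-pixel threshold adapts to locally noisy regions (e.g. foliage or water in the scene), which is the apparent motivation for computing a statistic for each pixel rather than a single global threshold.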
Current U.S. Class: 348/149
Patent References Cited: 4317117 February 1982 Chasek
4692806 September 1987 Anderson et al.
4847772 July 1989 Michalopoulos et al.
5001558 March 1991 Burley et al.
5034986 July 1991 Karmann et al.
5063603 November 1991 Burt
5109435 April 1992 Lo et al.
5161107 November 1992 Mayeaux et al.
5265172 November 1993 Markandey et al.
5280530 January 1994 Trew et al.
5283573 February 1994 Takatou et al.
5285273 February 1994 James et al.
5296852 March 1994 Rathi
5301239 April 1994 Toyama et al.
5317311 May 1994 Martell et al.
5375058 December 1994 Bass
5396429 March 1995 Hanchett
5402118 March 1995 Aoki
5416711 May 1995 Gran et al.
5448484 September 1995 Bullock et al.
5463698 October 1995 Meyer
5512942 April 1996 Otsuki
5557543 September 1996 Parsons
5559496 September 1996 Dubats
Assistant Examiner: Rekstad, Erick
Primary Examiner: Dastouri, Mehrdad
Attorney, Agent or Firm: Baker Botts L.L.P.
Accession Number: edspgr.06985172
Database: USPTO Patent Grants