Real-time multiple moving vehicle detection and tracking framework for autonomous UAV monitoring of urban traffic
- Publisher: Hamad bin Khalifa University Press (HBKU Press)
- Source: Qatar Foundation Annual Research Forum Proceedings, Volume 2013, Issue 1 (November 2013), ICTP-013
Abstract
Unmanned Aerial Vehicles (UAVs) have the potential to provide comprehensive information for traffic monitoring, road conditions, and emergency response. To enable autonomous UAV operations, however, video captured by UAV cameras must be processed with state-of-the-art algorithms for vehicle detection, recognition, and tracking. Processing aerial UAV images is challenging because they are usually captured with low-resolution cameras, from high altitudes, and while the UAV is in continuous motion. This motion makes it necessary to decouple camera motion from scene motion, and most moving vehicle detection techniques perform ego-motion compensation to that end: successive image frames are first registered to match two or more images of the same scene taken at different times, and moving vehicles are then labeled. Detected vehicles of interest are routinely tracked by the UAV. Vehicle tracking in UAV imagery remains challenging, however, due to constantly changing camera vantage points, changes in illumination, and occlusions. Most existing vehicle detection and tracking techniques suffer from reduced accuracy and/or entail intensive processing that prohibits their deployment onboard UAVs unless substantial computational resources are available. This paper presents a novel multiple moving vehicle detection and tracking framework suitable for UAV traffic monitoring applications. The proposed framework executes in real time with improved accuracy and is based on image-feature processing and projective geometry. FAST image features are first extracted, and outlier features are then identified using least-median-of-squares estimation. Moving vehicles are subsequently detected with a density-based spatial clustering algorithm.
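The detection stage described above ends by grouping outlier motion features into vehicle candidates with density-based spatial clustering (DBSCAN-style). Below is a minimal, self-contained Python/NumPy sketch of that clustering step, assuming the outlier features arrive as 2-D pixel coordinates; the function name and the `eps`/`min_pts` values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dbscan(points, eps=3.0, min_pts=4):
    """Minimal density-based clustering over 2-D feature coordinates.
    Dense groups of outlier (moving) features become clusters 0, 1, ...;
    sparse false matches are labelled -1 (noise). Sketch only: the
    parameter values are illustrative, not the paper's settings."""
    n = len(points)
    labels = np.full(n, -1)            # -1 = noise / unassigned
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    # Pairwise distances; fine for the small per-frame outlier sets.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    for i in range(n):
        if visited[i]:
            continue
        visited[i] = True
        neighbours = list(np.flatnonzero(d[i] <= eps))   # includes i
        if len(neighbours) < min_pts:
            continue                    # stays noise unless absorbed later
        labels[i] = cluster
        queue = neighbours
        while queue:                    # expand the cluster outward
            j = queue.pop()
            if not visited[j]:
                visited[j] = True
                nj = np.flatnonzero(d[j] <= eps)
                if len(nj) >= min_pts:  # j is a core point: keep expanding
                    queue.extend(nj.tolist())
            if labels[j] == -1:
                labels[j] = cluster
        cluster += 1
    return labels
```

Each non-negative label then corresponds to one candidate moving vehicle; points labelled -1 are treated as spurious feature matches.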
Vehicles are tracked using Kalman filtering, while an overlap-rate-based data association mechanism followed by a tracking-persistency check discriminates between true moving vehicles and false detections. The proposed framework does not involve the explicit application of image transformations (i.e., warping) to detect potential moving vehicles, which reduces computational time and decreases the probability of wrongly detected vehicles due to registration errors. Furthermore, the use of data association to correlate detected and tracked vehicles, along with a selective target-template update driven by the data-association decision, significantly improves overall tracking accuracy. For quantitative evaluation, a testbed was implemented to evaluate the proposed framework on three datasets: the standard DARPA Eglin-1 and RedTeam datasets, and a home-collected dataset. The proposed framework achieves recall rates of 97.1% and 96.8% (average 96.9%) and precision rates of 99.1% and 95.8% (average 97.4%) for the Eglin-1 and RedTeam datasets, respectively, with an overall average of 97.1%. On the home-collected dataset, it achieves 95.6% recall and 96.3% precision. Compared with other moving vehicle detection and tracking techniques in the literature, the proposed framework achieves higher accuracy on average and is less computationally demanding. These quantitative results demonstrate the potential of the proposed framework for autonomous UAV traffic monitoring applications.
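The tracking stage pairs a per-vehicle Kalman filter with overlap-rate (intersection-over-union) data association. The Python/NumPy sketch below assumes a constant-velocity state model over the vehicle centroid with a one-frame time step; the class name, noise covariances, and the `hits` persistency counter are illustrative assumptions, since the abstract does not specify the exact state model.

```python
import numpy as np

def iou(a, b):
    """Overlap rate (intersection-over-union) of two axis-aligned
    boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

class KalmanTrack:
    """Constant-velocity Kalman filter over a vehicle centroid.
    State x = [px, py, vx, vy]; observation z = [px, py].
    Noise covariances q and r are illustrative defaults."""
    def __init__(self, cx, cy, q=1e-2, r=1.0):
        self.x = np.array([cx, cy, 0.0, 0.0])
        self.P = np.eye(4) * 10.0                       # loose prior
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = 1.0               # dt = 1 frame
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0               # observe position
        self.Q = np.eye(4) * q
        self.R = np.eye(2) * r
        self.hits = 0       # persistency counter: confirms true vehicles

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        self.hits += 1
```

In a full pipeline, each new detection would be associated to the track whose predicted box has the highest overlap rate, and tracks whose `hits` counter stays low would be discarded as false detections, which is the persistency check.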