Please see the wiki for installation instructions.
OpenPTrack is an open source project launched in 2013 to create a scalable, multi-camera solution for person tracking. It enables many people to be tracked over large areas in real time, and is designed for applications in education, arts, and culture.
Our objective is to enable "creative coders" to build body-based interfaces for large groups of people—for classrooms, art projects, and beyond.
Based on the widely used, open source Robot Operating System (ROS), OpenPTrack provides:

- user-friendly camera network calibration
- person detection from RGB/infrared/depth images
- efficient multi-person tracking
- UDP and NDN streaming of tracking data in JSON format (see the sketch below)
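Because tracking data is streamed as JSON over UDP, application code does not need ROS; any process that can read UDP datagrams can consume it. Below is a minimal receiver sketch, assuming one JSON object arrives per datagram; the port number and the field names used here (`tracks`, `id`, `x`, `y`) are illustrative assumptions rather than OpenPTrack's documented schema, so check the wiki for the actual message format.

```python
import json
import socket

# Hypothetical port; match this to your OpenPTrack UDP configuration.
UDP_PORT = 21234

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))

while True:
    datagram, addr = sock.recvfrom(65535)  # one JSON message per datagram (assumed)
    try:
        msg = json.loads(datagram.decode("utf-8"))
    except (UnicodeDecodeError, ValueError):
        continue  # skip malformed packets
    # Field names below ("tracks", "id", "x", "y") are illustrative,
    # not OpenPTrack's documented schema.
    for track in msg.get("tracks", []):
        print(track.get("id"), track.get("x"), track.get("y"))
```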
With the advent of commercially available consumer depth sensors, and with continued computer vision research into multi-modal image and point cloud processing, robust person tracking with the stability and responsiveness needed to drive interactive applications is now possible at low cost. However, these research results remain difficult for application developers to use directly.
The project combines numerous state-of-the-art algorithms for RGB and/or depth tracking, and is built on a modular, node-based architecture that supports adding and removing sensor streams online.
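Under ROS, the tracker's output is just another topic in the node graph, so an application node can subscribe to it independently of which sensors are currently attached. The sketch below assumes the tracker publishes on a `/tracker/tracks` topic using an `opt_msgs/TrackArray` message with per-track `id`, `x`, and `y` fields; verify these names against your installation.

```python
import rospy
# opt_msgs/TrackArray is an assumed package/message name; verify it
# against your OpenPTrack installation before use.
from opt_msgs.msg import TrackArray

def on_tracks(msg):
    # Each track is assumed to carry an id and a world-frame position.
    for track in msg.tracks:
        rospy.loginfo("track %d at (%.2f, %.2f)", track.id, track.x, track.y)

if __name__ == "__main__":
    rospy.init_node("track_listener")
    # "/tracker/tracks" is an assumed topic name.
    rospy.Subscriber("/tracker/tracks", TrackArray, on_tracks)
    rospy.spin()
```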
For more information, see the website.
OpenPTrack is led by UCLA REMAP and Open Perception. Key collaborators include the University of Padova and Electroland. Code is available under a BSD license. Portions of the work are supported by the National Science Foundation (IIS-1323767).
If you use this code, please cite:
M. Munaro, A. Horn, R. Illum, J. Burke and R. B. Rusu. OpenPTrack: People Tracking for Heterogeneous Networks of Color-Depth Cameras. In IAS-13 Workshop Proceedings: 1st Intl. Workshop on 3D Robot Perception with Point Cloud Library, pp. 235-247, Padova, Italy, 2014.
M. Munaro and E. Menegatti. Fast RGB-D people tracking for service robots. Autonomous Robots, vol. 37(3), pp. 227-242, Springer, 2014.