Working with Microsoft Xbox Kinects has been a large part of my research with professors Sean and Natasha Banerjee with TARS at Clarkson University. These gaming sensors capture high-quality video and depth data, but there is no built-in way to synchronize multiple stock Kinects to facilitate multi-view 3D motion capture. My main contribution was improving a hardware board that delivers timecode to the Kinects' microphones, thereby robustly synchronizing them, and then conducting experiments with this system. This work was published at the IEEE International Conference on Multimedia and Expo (ICME) 2018, and I presented it in an oral session. The paper can be found here and here is the presentation: PDF/PowerPoint.
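
To give a sense of the idea, here is a minimal sketch (not the published system) of how a shared timing signal embedded in each Kinect's audio track could be used to align recordings from multiple devices. The sample rate, frame rate, pulse shape, and function names below are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch: align two Kinect recordings by locating a shared sync
# pulse in each device's audio track. Rates and names are assumptions.
import numpy as np

AUDIO_RATE = 16000   # assumed audio sample rate (Hz)
FRAME_RATE = 30      # assumed color/depth frame rate (fps)

def pulse_offset_seconds(audio: np.ndarray, pulse: np.ndarray) -> float:
    """Time (s) at which the reference sync pulse best matches the audio."""
    corr = np.correlate(audio, pulse, mode="valid")
    return int(np.argmax(corr)) / AUDIO_RATE

def frame_shift(audio_a: np.ndarray, audio_b: np.ndarray, pulse: np.ndarray) -> int:
    """Number of frames stream B must be shifted to line up with stream A."""
    dt = pulse_offset_seconds(audio_b, pulse) - pulse_offset_seconds(audio_a, pulse)
    return round(dt * FRAME_RATE)

# Usage with synthetic data: the same 10 ms tone embedded at different offsets.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pulse = np.sin(2 * np.pi * 1000 * np.arange(0, 0.01, 1 / AUDIO_RATE))
    a = 0.01 * rng.standard_normal(AUDIO_RATE); a[4000:4000 + pulse.size] += pulse
    b = 0.01 * rng.standard_normal(AUDIO_RATE); b[9333:9333 + pulse.size] += pulse
    print("shift B by", frame_shift(a, b, pulse), "frames")  # ~10 frames (1/3 s)
```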

This is a frame from one of the 3D reconstruction videos the system generates.