If you've ever used Apple's PhotoBooth software to take a picture of yourself with the webcam built into the screen, you may have noticed that the display turns white when the picture is taken. The idea is to use the monitor as a flash bulb: it illuminates your face and produces a better image in low-light conditions.
We can extend this idea to improve head tracking and face tracking applications. First, we capture an image with the display darkened. In the next instant, we capture an image with the display flooded white. The light from the display will illuminate the user's face, but it also has the useful property of barely reaching objects in the background. If we subtract the first image from the second, we get an image that contains rough depth information about the scene: closer objects appear brighter.
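The subtraction step can be sketched as follows. This is a minimal illustration using synthetic NumPy arrays in place of real webcam frames; the function name and the test values are my own, not from the original post. The one subtlety is doing the subtraction in a signed type so dark pixels don't wrap around.

```python
import numpy as np

def screen_flash_difference(dark_frame, lit_frame):
    """Subtract the dark-screen frame from the lit-screen frame.

    Nearby surfaces (the face) receive much more light from the display
    than the distant background, so the difference image acts as a rough
    foreground/depth map: closer objects come out brighter.
    """
    # Subtract in a signed type so the result cannot wrap around,
    # then clip back to the valid 8-bit range.
    diff = lit_frame.astype(np.int16) - dark_frame.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

# Synthetic stand-ins for the two captured frames (a real implementation
# would grab these from the webcam while toggling the display).
dark = np.full((4, 4), 40, dtype=np.uint8)   # ambient light only
lit = dark.copy()
lit[1:3, 1:3] = 200                          # "face" region, strongly lit

mask = screen_flash_difference(dark, lit)
```

In a real capture loop the two frames should be grabbed as close together in time as possible, so that head motion between them doesn't corrupt the difference.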
This method localizes the user's face with trivial image processing. In a practical implementation, the display could be flashed just once to kick off the tracking process and provide an accurate initial target for other head tracking algorithms. The localized face image can then be used to initialize fast tracking algorithms such as CAMSHIFT, or to train a Haar classifier.
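One way to turn the difference image into a tracker seed is to threshold it and take the bounding box of the bright (near-field) pixels. The sketch below does exactly that; the function name and threshold are my own choices, and the resulting `(x, y, w, h)` window is the kind of search window OpenCV's CamShift tracker expects as its starting region.

```python
import numpy as np

def face_window_from_difference(diff, thresh=60):
    """Return an (x, y, w, h) box around the bright pixels of a
    screen-flash difference image.

    The box can seed a tracker such as OpenCV's CamShift, or define
    the region used to train a classifier. Returns None if nothing
    in the frame was close enough to be lit by the display.
    """
    ys, xs = np.nonzero(diff > thresh)
    if len(xs) == 0:
        return None
    x0, x1 = xs.min(), xs.max()
    y0, y1 = ys.min(), ys.max()
    return (int(x0), int(y0), int(x1 - x0 + 1), int(y1 - y0 + 1))

# Hypothetical difference image with an illuminated face region.
diff = np.zeros((10, 10), dtype=np.uint8)
diff[2:7, 3:8] = 180

window = face_window_from_difference(diff)
```

A more robust version might take the largest connected component instead of the global bounding box, so a bright reflection elsewhere in the frame doesn't stretch the window.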
The "laptop-based 3D scanner": there is also potential to gather detailed depth information by illuminating the subject with light from different regions of the display. For example, the subject would appear different when lit from the left edge of the display than when lit from the right edge. From such a set of images, the surface normals of the subject can be estimated by computational photography techniques (photometric stereo), and a crude 3D representation of the subject could be derived.
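The normal-estimation step is classic photometric stereo: under a Lambertian model, a pixel's intensity is the dot product of the (known) light direction and the surface normal, scaled by albedo, so three or more lighting directions let you solve for the normal by least squares. The sketch below is my own illustration on one simulated pixel; the light-direction vectors are made-up stand-ins for where the left, right, and top screen regions might sit relative to the camera.

```python
import numpy as np

# Hypothetical unit light directions for three screen regions
# (left edge, right edge, top edge), in camera coordinates.
L = np.array([
    [-0.6, 0.0, 0.8],   # light from the left edge of the display
    [ 0.6, 0.0, 0.8],   # light from the right edge
    [ 0.0, 0.6, 0.8],   # light from the top edge
])

def estimate_normal(intensities, lights=L):
    """Recover a surface normal from Lambertian shading: I = rho * (l . n).

    Solves lights @ g = intensities for g = rho * n, then splits g into
    a unit normal and an albedo estimate.
    """
    g, *_ = np.linalg.lstsq(lights, intensities, rcond=None)
    rho = np.linalg.norm(g)   # albedo (reflectance) estimate
    return g / rho, rho

# Simulate a pixel whose true normal points straight at the camera,
# with albedo 0.9 and no noise.
true_n = np.array([0.0, 0.0, 1.0])
I = (L @ true_n) * 0.9
n, rho = estimate_normal(I)
```

Run per pixel over images captured under each screen region, this yields a normal map, which can then be integrated into a crude depth surface. In practice the screen is an area light rather than a point source, so the recovered geometry would be approximate.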
Chris Valente / March 30, 2011
Very cool stuff, Matt!