Sunday, May 8, 2011

Tale of Two Projected Cities: matching projected views of Google Earth onto physical models


A calibrated second projector projecting side textures from the Google Earth building view.

A video of the process

We're working on projecting Google Earth visualizations out into the real world, onto physical 3D surfaces. This lets users walk around the 3D space without wearing 3D glasses and allows collaborative interaction with the models. We are preparing a visualization of urban data to project onto this 3D model of the Boston skyline.

One challenge is projecting a Google Earth view so that it matches a physical model. We take advantage of the camera control parameters in Google Earth's image overlay feature, which let one match a photo to a Google Earth scene. An example of this photo matching is shown in the image of Air Force One and the limo on the PHX tarmac.
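The camera parameters being matched can be expressed in KML. A rough, illustrative fragment is shown below; all values are placeholders, not the ones used for the tarmac photo:

```xml
<!-- Illustrative KML Camera block (placeholder values): these are the view
     parameters one would otherwise tune by hand to match a photo. -->
<Camera>
  <longitude>-112.01</longitude>
  <latitude>33.43</latitude>
  <altitude>500</altitude>
  <heading>45</heading>
  <tilt>70</tilt>
  <roll>0</roll>
  <altitudeMode>absolute</altitudeMode>
</Camera>
```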
While it is possible to configure the camera parameters manually (shown on the left), doing so is very time-consuming and error-prone. This week we developed a technique that estimates them automatically with a non-linear least-squares fit over corresponding reference points in the physical model and the Google Earth scene. Since we use multiple projectors to cover the physical models, we first calibrate an overhead projector and then use it as a reference from which all the other projectors can automatically calibrate their own virtual cameras to match.
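The fitting idea above can be sketched as follows. This is a minimal illustration, not our actual code: the toy camera model (yaw, pitch, focal length, fixed standoff) and all names are assumptions standing in for the full Google Earth virtual-camera parameterization, and the solver is a plain Gauss-Newton loop with finite-difference Jacobians.

```python
import numpy as np

def project(params, pts3d):
    """Toy camera: yaw and pitch rotations plus focal length f,
    projecting 3D reference points to 2D scene coordinates."""
    yaw, pitch, f = params
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    # Rotate, then push points in front of the camera (fixed standoff).
    cam = pts3d @ (Rx @ Ry).T + np.array([0.0, 0.0, 10.0])
    return f * cam[:, :2] / cam[:, 2:3]

def gauss_newton(params, pts3d, obs2d, iters=50, eps=1e-6):
    """Minimise the sum of squared reprojection errors over the
    reference-point correspondences with Gauss-Newton steps."""
    params = np.asarray(params, float)
    for _ in range(iters):
        r = (project(params, pts3d) - obs2d).ravel()
        # Finite-difference Jacobian of the residuals w.r.t. each parameter.
        J = np.empty((r.size, params.size))
        for i in range(params.size):
            d = np.zeros_like(params)
            d[i] = eps * max(1.0, abs(params[i]))
            J[:, i] = ((project(params + d, pts3d) - obs2d).ravel() - r) / d[i]
        step = np.linalg.solve(J.T @ J, J.T @ r)
        params = params - step
        if np.linalg.norm(step) < 1e-10:
            break
    return params

# Synthetic check: recover known camera parameters from five
# reference-point correspondences.
pts3d = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 1], [2, 0, 1]], float)
true_params = np.array([0.1, -0.05, 800.0])
observed = project(true_params, pts3d)
fit = gauss_newton([0.0, 0.0, 500.0], pts3d, observed)
```

In the real setup the "observed" 2D points would come from the already-calibrated overhead projector's reference scene, rather than from synthetic data.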