Advanced Interfaces Group - School of Computer Science, The University of Manchester

Surface Depth Hallucination

Original photograph of Mayan Glyphs at Chichén Itzá. Synthetically relit image with novel lighting. Novel view rendering of the relit surface with some uniform specularity added to the material model.

Surface depth hallucination offers a simple, fast way to acquire albedo and depth for textured surfaces that exhibit mostly Lambertian reflectance. We obtain depth estimates entirely in image space, and from a single view, so no complications arise from registering the texture with the recovered depth.

The user simply takes two photographs of a textured surface from the same position, parallel to the surface: one under diffuse lighting conditions, as might be encountered on a cloudy day or in shadow, and the other with a flash (strobe). From these two images, together with a flash calibration image, we estimate an albedo map. We also estimate a shading image, primarily from the diffuse-lit capture. We develop a model relating depth to shading that is specifically tailored to textured surfaces with relatively little overall depth disparity. Applying this relationship to our shading image over multiple scales yields a per-pixel height field. Combining this height field with our albedo map gives a surface model that may be lit under any novel lighting condition and viewed from any direction.

Provided we have a suitable exemplar model, our method can also work from a diffuse-lit image alone, by histogram-matching it against the albedo and shading images of the exemplar model, further simplifying our data capture.
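To make the pipeline concrete, here is a minimal sketch in Python (NumPy/SciPy) of the flow just described. The function names, the single gain constant, the simple per-scale dark-is-deep rule, and the Lambertian relighting step are illustrative assumptions rather than the exact model from our paper; all inputs are assumed to be registered, linear-radiance HxWx3 float arrays.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def estimate_albedo(diffuse, flash, calib, eps=1e-6):
        # Subtracting the diffuse capture from the flash capture isolates the
        # flash contribution; dividing by the flash calibration image (a flat
        # reference card under the same flash) corrects for flash falloff.
        flash_only = np.clip(flash - diffuse, 0.0, None)
        return flash_only / np.maximum(calib, eps)

    def estimate_shading(diffuse, albedo, eps=1e-6):
        # Dividing the albedo out of the diffuse-lit capture leaves shading.
        return diffuse / np.maximum(albedo, eps)

    def hallucinate_height(shading, num_scales=5, gain=0.5):
        # Illustrative dark-is-deep heuristic over multiple scales: at each
        # scale, pixels darker than their local mean are treated as recessed,
        # and the per-scale offsets are summed into one height field.
        lum = shading.mean(axis=-1) if shading.ndim == 3 else shading
        height = np.zeros_like(lum)
        for s in range(num_scales):
            local_mean = gaussian_filter(lum, sigma=2.0 ** s)
            height += gain * (lum - local_mean)
        return height

    def relight_lambertian(albedo, height, light_dir):
        # Relight the recovered surface: derive normals from the height
        # field's gradients, then shade with one directional light
        # (Lambertian only; no specular term).
        gy, gx = np.gradient(height)
        normals = np.dstack([-gx, -gy, np.ones_like(height)])
        normals /= np.linalg.norm(normals, axis=-1, keepdims=True)
        l = np.asarray(light_dir, dtype=float)
        l /= np.linalg.norm(l)
        n_dot_l = np.clip(normals @ l, 0.0, None)
        return albedo * n_dot_l[..., None]

A real implementation would also need tone-linearized inputs and care at shading extremes; this sketch is only meant to show the shape of the pipeline.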

We validated our approach through experimental studies and found that users believed our recovered surfaces to be plausible. Further, users found it difficult to reliably identify our synthetically relit images as fakes. Details of our method and the results of our validation are to be published at SIGGRAPH 2008 [.pdf (Preprint)].

Videos



You can download videos illustrating our method from here:

Example Surfaces

Here we show a selection of relit images illustrating the wide variety of surfaces we used to test our method. Scaled versions of the full set of experimental stimuli used in our validation studies are available here. If you would like access to our high-resolution images, please contact me via email (mashhuda at manchester dot ac dot uk).

Synthetically relit images of: a brick wall, a brick path with leaves, a brick path, a doormat, a drystone wall, a headstone, a rock wall, and woodchips.

Sample Models

A selection of our models is available here as gzipped tar archives. Each archive contains two files: a surface mesh in .obj format and a high-resolution albedo map in .tiff format. These models are provided free to use for any purpose, but we would appreciate an acknowledgment.
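As a quick start, the sketch below shows one way to read an archive's contents in Python; the file names surface.obj and albedo.tiff are placeholders for whatever each archive actually contains.

    import numpy as np
    from PIL import Image

    def load_obj_vertices(path):
        # Minimal .obj reader: collect vertex positions from lines
        # beginning with 'v ' (faces and texture coordinates are ignored).
        verts = []
        with open(path) as f:
            for line in f:
                if line.startswith('v '):
                    verts.append([float(v) for v in line.split()[1:4]])
        return np.asarray(verts)

    vertices = load_obj_vertices('surface.obj')     # placeholder file name
    albedo = np.asarray(Image.open('albedo.tiff'))  # high-resolution albedo map
    print(vertices.shape, albedo.shape)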

News Stories

This work has been widely reported in the media. The following links are just a few of the stories that have appeared on the web: [New Scientist...] [YouTube video...] [Slashdot...] [Pressetext...]

Acknowledgments

Surface depth hallucination was developed as part of the Daedalus project, funded by the UK-EPSRC under grant EP/D069734/1 (May 2006 - April 2009). We gratefully acknowledge the support of Kevin Cain and the Mayaskies project for providing access to Chichén Itzá. We also thank Timo Kunkel for converting our SIGGRAPH video to Flash video.