(Title inspired by Frank Foster's "The Story of Computer Graphics")
Some of the graphics research I've worked on builds on the techniques of reflection mapping and environment mapping developed in the late 1970s and early 1980s. I presented a paper about the work at SIGGRAPH 98, which can be found here. This web page tells the story of how the original techniques came to be.
In my SIGGRAPH 98 paper I referenced reflection mapping using synthetically rendered environment maps as presented by Jim Blinn and Martin Newell in 1976:
|Blinn, J. F. and Newell, M. E. Texture and reflection in computer generated images. Communications of the ACM Vol. 19, No. 10 (October 1976), 542-547.|
I met with Jim Blinn in June 1999 during a visit to Microsoft Research, and by coincidence he was in the process of organizing some old files, including the images from this paper. The first environment-mapped object was the Utah Teapot, with a room image made with a paint program (which Blinn wrote) as the environment map:
In the paper, Blinn also included an image of a satellite, environment-mapped with an image of the earth and the sun which he drew, shown below. Note that in both cases the objects are also being illuminated by a traditional light source to create their diffuse appearance.
More images from Blinn's early environment mapping work may be found here.
While writing my SIGGRAPH 98 paper, I was surprised that there didn't seem to be a good reference for using real omnidirectional photographs as reflection maps. This seemed odd, since the technique is in common usage in the computer graphics industry, and was used in creating some of the more memorable movie effects of the 80's and 90's (e.g. the spaceship in Flight of the Navigator (1986), and the metal man in Terminator 2 (1991)). So at the SIGGRAPH 98 conference I spent some time researching the history of the technique.
While at SIGGRAPH 98 in Orlando, I talked to Paul Heckbert, Ned Greene, Michael Chou, Lance Williams, and Ken Perlin to try to find out the origin of the technique. The story that took shape was that the technique was developed independently by Gene Miller working with Ken Perlin, and also by Michael Chou working with Lance Williams, around 1982 or 1983. I heard that the first two images in which reflection mapping was used to place objects into scenes were of a synthetic shiny robot standing next to Michael Chou in a garden, and of a reflective blobby dog floating over a parking lot.
A few months later, with the help of Gene Miller, Lance Williams, and Paul Heckbert, I was able to see both of these images side by side:
|A reflection-mapped blobby dog floating
in the MAGI parking lot.
(Courtesy of Gene Miller)
|A reflection-mapped robot standing next
to Michael Chou.
(In hi-res courtesy of Lance Williams)
|In January 1999, Gene Miller sent over a wealth of information and images about his knowledge of the origin of the technique. Click here to continue on to Gene Miller's stories and images about the development of reflection mapping.||
Mike Chou and Gene Miller at SIGGRAPH 99.
The Chou and robot image appeared in Lance Williams's 1983 SIGGRAPH paper "Pyramidal Parametrics". The paper introduced MIP-mapping, an elegant pre-filtering scheme for avoiding aliasing in texture-mapping algorithms. MIP-mapping has since been implemented on scores of graphics architectures and is used everywhere from video games to PC graphics to high-end flight simulators. The reflection-mapped robot image was just one example used to demonstrate the technique.
|Williams, Lance. Pyramidal Parametrics. Computer Graphics (SIGGRAPH 83) Vol. 17, No. 3 (July 1983), 1-11.|
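As an aside for readers unfamiliar with the technique, the pre-filtering idea behind MIP-mapping can be sketched in a few lines of Python. This is an illustrative sketch of my own (the function name is hypothetical), not code from the paper: each pyramid level is built by averaging 2x2 blocks of the level below, so that a renderer can later sample a level whose texels roughly match the screen-space footprint of a pixel.

```python
# Illustrative sketch: build a MIP pyramid from a square, power-of-two
# grayscale image by repeated 2x2 box filtering (averaging).
def build_mip_pyramid(image):
    """Return a list of levels, from full resolution down to 1x1."""
    levels = [image]
    while len(levels[-1]) > 1:
        src = levels[-1]
        n = len(src) // 2
        dst = [[(src[2 * y][2 * x] + src[2 * y][2 * x + 1] +
                 src[2 * y + 1][2 * x] + src[2 * y + 1][2 * x + 1]) / 4.0
                for x in range(n)]
               for y in range(n)]
        levels.append(dst)
    return levels

# A 4x4 image yields three levels: 4x4, 2x2, and 1x1; the 1x1 top
# level holds the overall average of the image.
img = [[0, 0, 4, 4],
       [0, 0, 4, 4],
       [8, 8, 12, 12],
       [8, 8, 12, 12]]
pyramid = build_mip_pyramid(img)
```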
In talking to Gene Miller, I learned that in December 1982 he and Robert Hoffman submitted a paper on the technique to SIGGRAPH 83 but it was not accepted for publication. However, a revised version of this work appeared in the SIGGRAPH 84 course notes on advanced computer graphics animation:
| Gene S. Miller and C. Robert Hoffman. Illumination and Reflection Maps: Simulated Objects in Simulated and Real Environments. Course Notes for Advanced Computer Graphics Animation, SIGGRAPH 84.|
Notably, the notes describe how reflection maps can be pre-convolved to render diffuse and rough specular reflections, how the maps can be conveniently stored as perspective images using six cube faces, and how the limited dynamic range of film can be addressed by combining a series of photographs taken at different f/stops.
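The last of these ideas anticipates high-dynamic-range photography. A minimal Python sketch of the approach, under the simplifying assumption of a linear camera response (real film requires recovering its response curve), might look like this; the function and hat weighting are my own illustration, not the notes' algorithm:

```python
# Illustrative sketch: combine differently exposed photographs into one
# high-dynamic-range radiance map. Assumes a linear sensor response;
# pixel values are in [0, 1] and exposures are relative exposure times
# (a wider f/stop acts like a longer exposure).
def combine_exposures(images, exposures):
    """Weighted average of per-exposure radiance estimates (pixel / exposure).
    A hat weight downweights pixels near the clipped extremes 0 and 1."""
    def weight(v):
        return 1.0 - abs(2.0 * v - 1.0)  # peaks at mid-gray, 0 at 0 and 1

    radiance = []
    for pix in zip(*images):  # one tuple of values per pixel location
        num = sum(weight(v) * (v / t) for v, t in zip(pix, exposures))
        den = sum(weight(v) for v in pix)
        if den > 0:
            radiance.append(num / den)
        else:  # all exposures clipped: fall back to the largest estimate
            radiance.append(max(v / t for v, t in zip(pix, exposures)))
    return radiance

# Two pixels seen in two exposures: a bright pixel that clips in the
# long exposure, and a dim pixel that is well exposed in both.
images = [[1.0, 0.1],    # long exposure (t = 1.0); first pixel clipped
          [0.5, 0.05]]   # short exposure (t = 0.5)
exposures = [1.0, 0.5]
rad = combine_exposures(images, exposures)
```

The clipped value contributes zero weight, so the bright pixel's radiance comes entirely from the short exposure.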
In 1985, Lance Williams was part of a team at the New York Institute of Technology that used reflection mapping in a moving scene with an animated CG element. The piece "Interface" featured a young woman kissing a shiny robot. In reality, she was filmed kissing a 10-inch shiny ball, and the reflection map was derived from the reflections in the ball. To make the animation, the reflection map was applied to the robot, and the robot was composited into the scene to replace the ball.
|"Interface", courtesy of Lance Williams.|
Interface appeared at the SIGGRAPH 85 film show and is the first use of photo-based reflection mapping in an animation, and also its first use to help tell a story. The woman quickly kisses the robot and then heads out for the evening. As the silent robot waves goodbye, her reflected image recedes, leaving you to wonder what the poor automaton might be left to do for the evening.
The piece was also worked on by Carter Burwell and Ned Greene, and the actress was Ginevra Walker. Carter Burwell later composed music for feature films such as Raising Arizona, Miller's Crossing, The Hudsucker Proxy, and Barton Fink.
Lance Williams shortly thereafter added reflection mapping (as well as texture, bump, and transparency mapping) to Pacific Data Images' renderer, which was used to create Jose Dias' "Globo" reflection mapping images.
The first feature film to use the technique was Randal Kleiser's Flight of the Navigator in 1986. C. Robert Hoffman was part of the effects team that rendered a shiny morphing spaceship flying over and reflecting fields, cities, and oceans. The technique was recently revisited to render the reflective Naboo spacecraft in Star Wars: Episode I.
|Stills from Randal Kleiser's 1986 film Flight of the Navigator, demonstrating reflection mapping in a feature film.|
Also in 1986, Ned Greene published a paper further developing and formalizing the technique of reflection mapping. In particular, he showed that environment maps could be pre-filtered and indexed with summed-area tables to closely approximate correct anti-aliasing. Greene combined a real 180-degree fisheye image of the sky with a computer-generated image of desert terrain to create a full-view environment cube.
| Ned Greene. Environment Mapping and Other Applications of World Projections. IEEE Computer Graphics and Applications, Vol. 6, No. 11 (November 1986).|
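To illustrate Greene's summed-area table idea: a table in which each entry holds the sum of all pixels above and to the left lets the average over any axis-aligned rectangle of the map be computed with just four lookups, regardless of the rectangle's size. The following is my own small Python sketch of the data structure, with hypothetical function names, not Greene's implementation:

```python
# Illustrative sketch of a summed-area table for constant-time box filtering.
def summed_area_table(image):
    """S[y][x] = sum of image[0..y-1][0..x-1], padded with a zero row/column."""
    h, w = len(image), len(image[0])
    S = [[0.0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0.0
        for x in range(w):
            row_sum += image[y][x]
            S[y + 1][x + 1] = S[y][x + 1] + row_sum
    return S

def box_average(S, x0, y0, x1, y1):
    """Mean over pixels [x0, x1) x [y0, y1), using only four table lookups."""
    total = S[y1][x1] - S[y0][x1] - S[y1][x0] + S[y0][x0]
    return total / ((x1 - x0) * (y1 - y0))

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
S = summed_area_table(img)
```

With this table, filtering a tiny reflection footprint and filtering one spanning the whole map cost exactly the same four lookups, which is what makes the scheme attractive for anti-aliased environment mapping.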
Reflection mapping made its most visible appearances to date in a pair of groundbreaking films by James Cameron. Following the use of reflection mapping and shape morphing in Flight of the Navigator and other examples seen at SIGGRAPH and in commercials, visual effects professionals including Dennis Muren, John Knoll, Lincoln Hu, Scott Anderson, Jay Riddle, and Mark Dippe at Industrial Light and Magic further evolved the techniques to help create the amazing looks of the water creature in The Abyss and the T-1000 robot in Terminator 2.
In 1993, Paul Haeberli and Mark Segal published a survey of innovative uses of texture-mapping. Reflection mapping was one such application, and they demonstrated the technique by applying an environment map taken in a cafe to a torus shape:
|A reflection mapping still from Haeberli and Segal, 1993.|
| Paul Haeberli and Mark Segal. Texture Mapping as a Fundamental Drawing Primitive. Fourth Eurographics Workshop on Rendering, June 1993, pp. 259-266.|
Paul Haeberli photographed the "cafe" environment map in Cafe Verona in Palo Alto, CA. He used a Nikon film camera with a 180-degree fisheye lens that he borrowed from Ned Greene at NYIT. To create the full 360-degree environment, Haeberli mirrored the front 180-degree image to create a plausible back 180 degrees, and then stitched the two images together to produce a complete environment. This environment map, along with another one shot in a flower garden, shipped as the two environment map choices in Silicon Graphics' popular real-time shading demos in the 1990s. In December 2003 Haeberli unearthed the original fisheye image used to create the Cafe environment map, and the full 360-degree map derived from it, shown below.
|Paul Haeberli's original 180-degree fisheye image||The assembled 360-degree Cafe environment map.||A bust of Beethoven environment-mapped in the real-time SGI environment mapping demo.|
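Haeberli's mirroring trick can be sketched in Python for a latitude-longitude image: the front 180 degrees of longitude is flipped left-to-right to synthesize a plausible back hemisphere, and the two halves are concatenated into a full 360-degree map. This is my own illustrative sketch, not Haeberli's actual code:

```python
# Illustrative sketch of mirroring a front hemisphere into a full
# 360-degree latitude-longitude environment map.
def mirror_to_full_environment(front):
    """front: rows of pixels covering 180 degrees of longitude.
    Returns rows covering 360 degrees (front + left-right mirrored back)."""
    return [row + row[::-1] for row in front]

# A tiny 2x3 "image" stands in for the front hemisphere.
front = [["a", "b", "c"],
         ["d", "e", "f"]]
full = mirror_to_full_environment(front)
```

Because each row ends with its own reversal, the map is continuous at both seams where front meets back, which is why the mirrored environment looks plausible on a curved reflector even though the back half is fictitious.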
Paul Debevec / firstname.lastname@example.org