At the core of what we do at Domeble is the 360° HDRI, which in its simplest terms is a fully spherical panoramic image of a scene, from nadir to zenith, captured with a huge dynamic range of up to 36 f-stops, hence the name High Dynamic Range Image.
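To put that 36 f-stop figure in context: each stop is a doubling of light, so the dynamic range of an HDR file can be estimated straight from its pixel values. Here is a quick sketch of that idea, assuming a Radiance .hdr file and OpenCV's reader (the file name is just a placeholder):

```python
import cv2
import numpy as np

# Load a 360° HDRI (equirectangular Radiance .hdr) as linear float32 values.
env = cv2.imread("example_env.hdr", cv2.IMREAD_UNCHANGED)

# Per-pixel luminance (Rec. 709 weights); ignore pure-black pixels.
b, g, r = cv2.split(env)
luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b
nonzero = luminance[luminance > 0]

# Dynamic range in f-stops = log2(brightest / darkest).
stops = np.log2(nonzero.max() / nonzero.min())
print(f"Approximate dynamic range: {stops:.1f} stops")
```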
As the founder of Domeble and one of the early adopters of CG photography and 360° HDRIs (or 360 HDRI maps, as they are often called) in the early 2000s, I have spent thousands of hours capturing, experimenting, and developing techniques and capture methods in this genre. It struck me recently that we as creators overlook why we need this format, how these techniques arrived in the industry, and what history lies behind that arrival into the world of 3D.
More than most, I guess, I am aware of this history, and I can vividly recall the exciting discussions at CGAM 2008 in LA with like-minded, curious photographers and CG artists, when the world was waking up to creating photo-realistic renders from nothing more than 3D CAD data, and just how scary this new world of image-making looked.
There were several keynotes at that conference that I believe form the foundation of what we do today in the world of VR and CGI, but… how did we get here?
The fundamental ideas behind HDR imaging date back to the early 20th century, when researchers and photographers explored ways to capture scenes with a higher dynamic range. The great Ansel Adams was a prominent voice, sharing his Zone System of exposure and developing techniques to expand the dynamic range of the scenes he photographed.
Early attempts included techniques like multiple exposures and darkroom manipulation, but these methods were limited in their ability to achieve true HDR. It wasn't until 1986 that the first digital HDRI was captured (yes, digital capture in 1986!) by Kodak's research labs, using a CCD sensor to capture and ultimately display a high dynamic range image. This development continued into the 1990s, when significant research programmes started emerging in the field of computer graphics.
Paul Debevec (without doubt the godfather of HDRI in CGI) developed a technique called image-based lighting (IBL), which became crucial in creating realistic lighting and reflections in computer-generated scenes. He and his team at UC Berkeley created the landmark Debevec light probe: a mirrored chrome sphere photographed at multiple exposures to capture the illumination of real-world environments. The technique involves using HDR environment maps (captured with a light probe or similar methods) to illuminate CG scenes realistically.
Instead of traditional artificial light sources, IBL uses the captured environment itself as the primary light source, resulting in more natural and convincing lighting in computer-generated imagery. This approach allowed for an accurate representation of real-world lighting in CG scenes and ultimately allowed rendered images to blend convincingly with the environments they were lit by.
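To make the idea a little more concrete, here is a minimal sketch (my own illustration, not Debevec's code, and assuming OpenCV's Radiance .hdr reader) of how an equirectangular HDR environment map can act as a light source: every pixel is a small patch of the environment whose radiance, direction, and solid angle contribute to the light arriving at a surface.

```python
import numpy as np
import cv2  # assumes OpenCV reads the Radiance .hdr as linear float32

def diffuse_irradiance(env_path: str, normal: np.ndarray) -> np.ndarray:
    """Estimate the diffuse light falling on a surface with the given unit
    normal, using an equirectangular HDR environment map as the only light."""
    env = cv2.imread(env_path, cv2.IMREAD_UNCHANGED)   # H x W x 3, linear radiance
    h, w, _ = env.shape

    # Direction vector for every pixel of the equirectangular map.
    theta = (np.arange(h) + 0.5) / h * np.pi           # polar angle, 0 at zenith
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi       # azimuth
    sin_t = np.sin(theta)[:, None]
    dirs = np.stack([
        sin_t * np.cos(phi)[None, :],                       # x
        np.broadcast_to(np.cos(theta)[:, None], (h, w)),    # y (up)
        sin_t * np.sin(phi)[None, :],                       # z
    ], axis=-1)                                        # H x W x 3 unit vectors

    # Solid angle covered by each pixel shrinks towards the poles.
    solid_angle = (np.pi / h) * (2.0 * np.pi / w) * np.broadcast_to(sin_t, (h, w))

    # Lambertian weighting: only directions above the surface contribute.
    cos_term = np.clip(dirs @ normal, 0.0, None)
    weight = (cos_term * solid_angle)[..., None]
    return (env * weight).sum(axis=(0, 1))             # per-channel irradiance

# Example: light arriving at an upward-facing surface in a studio environment.
# irradiance = diffuse_irradiance("studio_env.hdr", np.array([0.0, 1.0, 0.0]))
```

A production renderer does essentially this at a far larger scale, with importance sampling and specular reflection on top, which is why the quality of the captured HDRI dictates the quality of the final lighting.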
Debevec and his team further extended their research to capture the reflectance properties of real-world materials using a technique called Reflectance Transformation Imaging (RTI). This approach allowed for the creation of more realistic CG materials by capturing how light interacts with real surfaces under different lighting conditions. This probably opened an exciting can of worms for Debevec: once he had established the science of IBL and RTI, he developed the Light Stage, a spherical array of high-resolution cameras and lights capable of capturing facial expressions and appearance in incredibly high detail. This technology has been used in numerous movies and video games to create lifelike digital characters.
As HDR imaging gained popularity in the early 2000s, photographers and 3D artists started using multiple exposures of panoramic scenes to create 360° HDRIs. This involved capturing a series of bracketed exposures to cover a wide dynamic range, merging them, and then stitching the results together to form a high-resolution, full 360-degree HDR image. From here, 360° HDRI quickly found applications in the computer graphics and visual effects industries. It became a crucial tool for capturing lighting information from real-world environments and using that data to realistically illuminate computer-generated scenes, objects, and characters, just as Debevec had experimented with and developed back in the 90s.
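As a rough illustration of the merge step (file names and exposure times below are placeholders), OpenCV ships an implementation of Debevec and Malik's radiance-map recovery that turns a bracketed sequence into a single floating-point HDR image; stitching the resulting HDR tiles into the full sphere is a separate step, handled by tools such as PTGui.

```python
import cv2
import numpy as np

# Hypothetical bracketed sequence from one camera position (LDR frames).
files = ["pano_tile_ev-4.jpg", "pano_tile_ev-2.jpg", "pano_tile_ev0.jpg",
         "pano_tile_ev+2.jpg", "pano_tile_ev+4.jpg"]
images = [cv2.imread(f) for f in files]

# Exposure times in seconds matching the bracket above (illustrative values).
times = np.array([1/2000, 1/500, 1/125, 1/30, 1/8], dtype=np.float32)

# Recover the camera response curve, then merge into a linear radiance map
# using Debevec & Malik's method as implemented in OpenCV.
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, times)

merge = cv2.createMergeDebevec()
hdr = merge.process(images, times, response)  # float32, linear radiance

cv2.imwrite("pano_tile.hdr", hdr)  # Radiance .hdr, ready for stitching
```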
There is no doubt about the crucial role Debevec played in advancing HDR imaging, image-based lighting (IBL), and 360° HDRI. His work on the light probe, image-based lighting, and Light Stage technology has had a profound impact on the CGI industry and visual effects in movies. But he is not alone in developing the HDRI medium; there are a few others to whom we all owe a debt of gratitude, and it is worth shouting out these pioneers and the contributions they have made to allow us to be creatively free in our CG explorations.

Let's start with Greg Ward, the computer graphics expert and creator of the Radiance software, which is widely used for rendering and simulating lighting in 3D scenes. Radiance was instrumental in handling HDR data and was among the early tools that facilitated the use of HDR environment maps and 360° HDRIs in CGI.
Then there's Timothy Hawkins, who, along with Greg Ward, developed the LogLuv TIFF format, a logarithmic encoding scheme for storing HDR images that became a standard for HDR data storage and manipulation; and Paul Bourke, a researcher and computer graphics expert who made significant contributions to the field of panoramic imaging. His work on panoramic photography and image-stitching techniques laid the groundwork for 360° HDRI.
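On the LogLuv idea, here is a rough sketch of why logarithmic encoding suits HDR data: spacing code values evenly in stops (log2 of luminance) keeps the relative error roughly constant across the whole range, which matches how f-stops and human vision behave. This illustrates only the principle, not the actual LogLuv bit layout, and the range constants are made up for the example.

```python
import numpy as np

# Illustrative only: quantise linear luminance into a 16-bit log code,
# in the spirit of LogLuv (NOT the real LogLuv TIFF specification).
def encode_log16(lum, lo=1e-4, hi=1e6):
    stops = np.log2(np.clip(lum, lo, hi) / lo)      # stops above the floor
    total_stops = np.log2(hi / lo)                  # range covered by the code
    return np.round(stops / total_stops * 65535).astype(np.uint16)

def decode_log16(code, lo=1e-4, hi=1e6):
    total_stops = np.log2(hi / lo)
    return lo * 2.0 ** (code.astype(np.float64) / 65535 * total_stops)

lum = np.array([0.001, 1.0, 250.0, 90000.0])        # ~26 stops of range
roundtrip = decode_log16(encode_log16(lum))
print(np.max(np.abs(roundtrip / lum - 1)))          # relative error stays tiny
```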
There is one unsung hero in this story for me, and that is Bruno Mercer, a researcher and developer who contributed to the development of the PTGui software, a popular tool for creating high-quality panoramic images. PTGui has been used by many photographers and CGI artists to generate 360° HDRIs; without it, we would all have pretty terrible workflows, and here at Domeble it's the glue in many of our processes. It's also easy, as image creators, to overlook the gaming industry, and credit has to go to John Carmack, who played a role in popularising 360° HDRI in gaming. He implemented HDR rendering and lighting techniques, including the use of 360° HDRI, in the video game "Doom 3," which showcased the potential of HDR in gaming visuals.
The final shout-out in this brief 360° HDRI history must go to ILM (Industrial Light & Magic), Weta Digital, and Digital Domain, Hollywood VFX giants in their own right, who have all played a significant role in adopting and pushing the boundaries of 360° HDRI. They integrated HDR imaging techniques into their pipelines, using them in blockbuster movies to achieve photorealistic lighting and reflections. It's important to note that the development of 360° HDRI has been a collaborative effort, with numerous researchers, photographers, and artists contributing to its evolution.
These pioneers have collectively revolutionised computer graphics, virtual reality, and visual effects, making realistic lighting and rendering possible in the digital realm, and I feel proud that Domeble continues to push the boundaries of the genre through our intensive and explorative R&D programme.