By definition, an HDR image contains more latitude in pixel brightness than can be displayed on a monitor. When you take a standard digital photo of something bright like the sky at sunrise, a short exposure may capture the sky properly, but the ground will be black. Re-expose to capture the ground and now the sky is blown out.
The really maddening thing about digital cameras is that you get to see the photo right after you take it. So you see your black ground on the screen, but then look up to find that the ground isn't black at all. In fact, it may look about as bright to you as the much lighter exposure where the sky is completely gone. There you stand with your eyes easily able to see everything, a little annoyed with your camera's poor showing.
In a nutshell, the problem is dynamic range. Your eyes can capture a wide dynamic range, but the camera is capturing a narrow one. HDR photography lets you combine different exposures to create a floating point image that represents the real world scene being photographed, sans your camera's limitations. But then when you want to show these scene-referred pixels on your monitor, you have to do some tone mapping.
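If you're curious what that exposure-combining step looks like under the hood, here's a toy sketch in Python with NumPy. The function name and the simple hat-shaped weighting are my own invention for illustration, not what Photoshop actually does internally:

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge linearized exposures into one floating-point HDR image.

    `images` is a list of float arrays in linear light (0.0-1.0);
    `exposure_times` are the shutter times in seconds. Simplified
    sketch: a hat-shaped weight trusts mid-tones and distrusts
    pixels near black or near white clipping.
    """
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(img - 0.5) * 2.0   # hat weight, peak at mid-gray
        num += w * img / t                  # scene radiance estimate
        den += w
    return num / np.maximum(den, 1e-6)      # weighted average, avoid /0
```

Dividing each exposure by its shutter time puts them all on the same radiance scale, which is why the merged result can hold values far above 1.0.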
Here are examples of tone mapping available in Adobe Photoshop and After Effects. The first four are the standard ones that ship with Photoshop, while the last two are made using a technique I'll show you.
1. sRGB / Gamma 2.2 curve
2. Highlight Compression
3. Equalize Histogram
4. Local Adaptation
5. Film Profile Tone Mapping
6. Film Profiles + color correction and film grain
The sRGB curve in #1 is what is typically displayed by default. A gamma curve is applied to the 0.0-1.0 range and the rest is clipped. This is what you'd get from just taking a regular photo.
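In code, that default display is about this simple. This is a sketch: true sRGB uses a piecewise curve, but a plain gamma 2.2 power function is close enough to make the point:

```python
import numpy as np

def display_srgb(hdr):
    """Default display of HDR pixels, as in option #1 above.

    Everything above 1.0 is simply clipped away (which is why the
    sky blows out), then a gamma 2.2 curve is applied to what's
    left. Approximation: real sRGB has a short linear toe segment.
    """
    clipped = np.clip(hdr, 0.0, 1.0)   # values over 1.0 are lost here
    return clipped ** (1.0 / 2.2)      # simple gamma 2.2 encode
```

Note that a pixel at 4.0 and a pixel at 400.0 both come out as pure white; all the highlight detail the HDR image captured is thrown away before it ever reaches the screen.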
Options 2-4 try to bring bright values down into visible range without crunching the other pixels too badly. You can see in each example that more detail has been brought out around the sun and the coarse clipping has been reduced. However, each technique has trade-offs. You'll notice that detail is lost in areas where a bright object comes close to a darker one such as the disappearance of a clear horizon line near the sun. Local Adaptation can sometimes produce halos in these circumstances as it fights to maintain contrast while bringing pixel values together. Sometimes the image comes out looking more like a painting than a photo.
Don't get me wrong, each image is different and these algorithms can produce some good results. But a tone mapping technique I have grown to like involves simulating the way film would capture that same scene. For one reason or another, people have always had a fondness for the look of film, and many movies are still shot on film for this reason. But what is it about the look of film that people like so much?
One explanation is that film is the original high dynamic range capture format. Even under long exposures, a film negative is never absolutely 100% exposed, so highlights can retain detail that would be lost by a CCD.
After Effects CS3 ships with a series of ICC profiles that describe the response of different film stocks to light. Since an HDR image describes the actual light in a scene, converting from the linear HDR color space profile to a film stock profile using the Color Profile Converter creates a simulated negative in log space (make sure you use the Absolute Colorimetric intent). From there, you can apply a Theatre Preview Profile, which simulates that negative being printed to a film positive and then projected. So if you trust the color science involved, you're looking at your scene as if you had shot it for a Hollywood movie.
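To get a feel for why the log-space negative holds highlight detail, here's a toy Cineon-style encoder. The constants (black at code 95, reference white at 685, roughly 300 code values per decade of exposure) follow the usual Cineon convention, but a real film profile also models the stock's toe and shoulder, which this straight log line ignores:

```python
import numpy as np

def lin_to_cineon(lin):
    """Encode linear scene values into 10-bit Cineon-style log codes.

    Simplified sketch: reference white (lin = 1.0) lands at code 685,
    black at code 95, ~300 codes per decade of exposure. No toe or
    shoulder modeling, unlike a real film stock profile.
    """
    lin = np.maximum(np.asarray(lin, dtype=float), 1e-6)  # avoid log10(0)
    code = 685.0 + 300.0 * np.log10(lin)                  # log encode
    return np.clip(code, 95, 1023)                        # 10-bit range

def cineon_to_lin(code):
    """Inverse: decode log codes back to linear light."""
    return 10.0 ** ((np.asarray(code, dtype=float) - 685.0) / 300.0)
```

Notice that a pixel eight times brighter than white still only lands around code 956, comfortably inside the 10-bit range, where a clipped 8-bit image would have discarded it entirely.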
Looking at example #5, you can see that the results are pretty impressive. There is a good tradeoff between detail around the sun and detail in the rest of the scene. And the sun has a very nice film-style glow around it with no hard clipping at all. Since the film profile response is a simple curve, you don't get any of the artifacts of the other tone mapping methods. Finally, there is probably something about the simulated printed film that will remind you of all the movies you ever saw before the dawn of the digital age.
In this case I'm doing everything manually: I've disabled all color management by using no working space profile and checking "Preserve RGB" in the HDR files. The same process can be set up as a display LUT so that your project lives in linear but gets viewed through the film preview. To do so, set a linear project working space, turn on Display Color Management in the View menu, and then use View > Simulate Output > Kodak 5218 to Kodak 2383, which uses the same settings I'm using here. However, to actually output an 8-bit JPEG or TIFF with the film look, you'll have to set up the Color Profile Converter as I have.
In example #6, I have done a little more processing. By applying Levels to the original linear HDR pixels, I can do a realistic color correction where light is brought down in the scene as it would be in nature, making it cooler in this case.
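The principle behind correcting in linear is just multiplication: scaling linear pixel values is the same as dimming or tinting the light in the scene itself. A minimal sketch, where the channel gains are made-up numbers chosen for a cooler look:

```python
import numpy as np

def relight(hdr_rgb, gains=(0.85, 0.95, 1.1)):
    """Color-correct linear HDR pixels by scaling each channel.

    Multiplying in linear light behaves like changing the light in
    the scene itself, which is why corrections here look natural.
    The default gains are illustrative only: less red, more blue,
    for a cooler image.
    """
    return hdr_rgb * np.asarray(gains, dtype=float)
```

Doing the same multiply after a gamma or log encode would shift tones unevenly; in linear, a 10% boost to blue really is 10% more blue light everywhere.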
Then I add a little film grain while the pixels are in simulated log space, in between the two Color Profile Converter applications. Add Grain is a 16-bit effect, but that matters little here because log space does not need floating point values over 1.0. In fact, log footage is usually stored in 10-bit Cineon files, so you have more bits than a typical film scan would.
By switching the simulated film stocks, you can also see how your same scene—accurately stored as an HDR image—would appear using different stocks. Much cheaper than experimenting with the real thing.
Really great idea for tone mapping.
But maybe I am missing something: if you convert from HDR to a film profile, why in your example picture in 'to film log' is the input profile set to sRGB? How can sRGB describe the actual light in a scene?
The setup assumes you have a linear input, so we're choosing sRGB with the linearize box checked. The output of that first Color Profile Converter is then a log image.