The next iOS update (4.1) will add a high dynamic range (HDR) photo feature to the iPhone's built-in camera. According to Steve Jobs, HDR works by snapping three photos in rapid succession: one underexposed, one normally exposed, and one overexposed. The software then merges the three into a single photo that retains detail in the highlight and shadow areas that would otherwise be lost to the limited dynamic range of the image sensor.
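Apple has not published how its merge works, but the general idea of blending differently exposed frames can be sketched with a simple single-scale weighting scheme (in the spirit of Mertens-style exposure fusion): each pixel is weighted by how close it is to mid-gray, so well-exposed pixels dominate the result. The function name and the toy images below are illustrative assumptions, not Apple's pipeline.

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Blend differently exposed grayscale frames (values in [0, 1]).

    Each pixel is weighted by a Gaussian centered at 0.5, so pixels
    near mid-gray (well exposed) contribute most. This is a simplified,
    single-scale take on exposure fusion -- not Apple's actual method.
    """
    images = [np.asarray(im, dtype=np.float64) for im in images]
    # Well-exposedness weight: peaks at 0.5, falls off toward 0 and 1.
    weights = [np.exp(-((im - 0.5) ** 2) / (2 * sigma ** 2)) for im in images]
    total = sum(weights) + 1e-12  # guard against division by zero
    return sum(w * im for w, im in zip(weights, images)) / total

# Toy 2x2 "scene": under-, normally, and over-exposed versions.
under = np.array([[0.05, 0.10], [0.40, 0.45]])
normal = np.array([[0.20, 0.40], [0.80, 0.90]])
over = np.array([[0.60, 0.80], [0.98, 0.99]])
result = fuse_exposures([under, normal, over])
```

Because the output is a convex combination of the inputs at every pixel, clipped highlights in one frame are filled in from a frame where that region was better exposed.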
This sounds good, but how well does it work?
Not many people know yet, since the new iOS version has not been officially released to the general public (rumors point to September 8th, 2010). However, it was released to iPhone developers a few days ago, and Wired Magazine's Gadget Lab published a hands-on test of the new HDR feature. Judging from the posted samples, the feature does seem to work: more detail is preserved in the skies and shadows.
For conventional HDR photos taken with a camera, a tripod is almost a must, since the frame must not shift between the consecutive captures; blending three misaligned photos will certainly produce a blurred result. Can the iOS HDR feature avoid the camera-shake problem? How fast are the three shots?
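One common way software can cope with small handheld shifts is to realign the frames before blending, for example by estimating the translation between consecutive shots with phase correlation. The sketch below recovers an integer pixel shift between two frames; it is a generic technique and an assumption on my part, not a description of what Apple actually does.

```python
import numpy as np

def estimate_shift(ref, moved):
    """Estimate the integer (row, col) translation of `moved` relative
    to `ref` via phase correlation: the normalized cross-power spectrum
    of the two frames has an inverse FFT that peaks at the shift.
    """
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moved)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12  # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative shifts.
    return tuple(int(p - s) if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))

# Toy example: displace a random frame by (2, 3) pixels and recover it.
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
moved = np.roll(ref, shift=(2, 3), axis=(0, 1))
shift = estimate_shift(ref, moved)
```

With the shift known, the misaligned frame can be translated back before fusion, which is why fast burst capture plus software alignment can substitute for a tripod when the hand movement is small.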
Keywords: HDR, High Dynamic Range, iOS