In preparing for an upcoming vacation I've been busy practicing with my Nikon D70 DSLR, and I've been sucked once again into the debate between using that camera's native sensor image (camera raw) or using standard JPEGs. Although I upgraded from Photoshop 7 to CS2 a while ago I haven't played with it too much, so this seemed like a perfect opportunity for me to do my own tests.
I took two sets of photos, one indoor in low light and one outdoors with bright sun. Each set consisted of sixteen photos. I used four different ISO equivalents (200, 400, 800, and 1,600) and four different image quality settings (raw, fine, normal, and basic).
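For my own bookkeeping, the shooting matrix above is just the cross product of the two variables. A minimal sketch (the tuple names are mine, not anything from the camera):

```python
from itertools import product

# The four ISO equivalents and four quality settings from the test.
ISOS = (200, 400, 800, 1600)
QUALITIES = ("raw", "fine", "normal", "basic")

# One shot per (ISO, quality) combination, per set: 4 x 4 = 16 photos.
shots = list(product(ISOS, QUALITIES))
print(len(shots))  # 16
```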
After taking all the photos I opened up four at a time from the same set and ISO group in Photoshop and visually compared them. The JPEGs open without any additional work, but opening a raw file invokes the Adobe Camera Raw (ACR) plug-in. ACR wants to twiddle the color, contrast, white balance, and so on for you, but ⌘-U undoes all those tweaks and performs the most straightforward mapping of sensor data to visible image.
Of course, part of the problem with camera raw is that there is no "one true way" to interpret the sensor data, so every different software program will produce a different image from the same raw data. Similarly, every digital camera has its own firmware interpretation of the sensor data that it applies when creating its JPEG images. Sometimes this firmware is very good, sometimes not, and sometimes it produces something that's just not aesthetically pleasing to the individual.
The decision to take the raw data and process it outside of the camera (in Photoshop or similar) tremendously slows down the workflow, though, and storing all that raw data eats up storage space (both in the camera's memory card and on your computer) quickly as well. With my camera, the raw data takes up about five megabytes, while the normal JPEG takes only one and a half megabytes.
So how does my camera rate?
At the sizes that I'll be viewing, both online and in print, in almost all instances I can't see a significant difference. Sure, if I blow the images up to 200% on-screen I can see some JPEG compression artifacts around high-contrast areas in the "basic" quality, and a bit more chromatic aberration in "basic" as well. But even when printed at 8" × 10" on semi-gloss paper with an Epson R800 I can't see those details.
The only images where I could see a meaningful difference were the indoor shots at 1600 ISO equivalent. Here the camera-produced JPEGs showed a lot more sensor noise than ACR did, enough that it was visible in print at 8" × 10". So, if I'm shooting in very low light and I have to max out the ISO equivalent to keep my shutter speeds reasonable, it'll probably be worth it for me to switch to raw. I could probably remove the noise from the JPEG with a Photoshop filter, but it seems more convenient to let ACR handle it automatically.
Yea! This means I can store around 1,100 images on a 2 GB memory card—plenty of storage for even my most photo-intensive days. "JPEG Normal" for me!
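The arithmetic behind that frame count can be sketched quickly. The 5 MB and 1.5 MB figures come from the post; the ~1.8 MB worst-case JPEG size is my assumption, since camera counters quote a conservative estimate rather than dividing by the average file size:

```python
# Back-of-envelope memory-card capacity.
CARD_MB = 2_000        # 2 GB card, in decimal megabytes
RAW_MB = 5.0           # D70 raw file, per the measurements above
JPEG_NORMAL_MB = 1.5   # "normal" quality JPEG, per the measurements above

def frames(card_mb: float, file_mb: float) -> int:
    """How many files of a given size fit on the card."""
    return int(card_mb // file_mb)

print(frames(CARD_MB, RAW_MB))          # 400 raw frames
print(frames(CARD_MB, JPEG_NORMAL_MB))  # 1333 JPEGs at the average size
# Dividing by an assumed worst-case JPEG of ~1.8 MB lands near the
# ~1,100-frame figure the camera reports.
print(frames(CARD_MB, 1.8))             # 1111
```

So raw roughly triples my storage cost per shot, which is the trade-off the whole post hinges on.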