There are many different settings on modern digital cameras, especially on high-end DSLRs. The user manuals are so thick and heavy that some manufacturers aren't even providing a paper copy anymore. Often, the daunting task of learning what each button does and figuring out the intricate menus and sub-menus keeps photographers in Auto mode: they quickly get the pictures taken and then, in their favorite software, create what they really envisioned while taking the photo. After all, there isn't a whole lot that can't be done to a photograph in a digital darkroom. But is it really best to take this approach? Are the default and "automatic" settings allowing the most flexibility once you get your photo home and on the computer? You may be surprised to find that if you never get out of Auto, your camera may be making more permanent decisions than you think. Let me explain.
The camera knows best...right?
Today's cameras are more advanced than ever, with the ability to analyze a scene, find faces, detect smiles, and in some cases even re-take a picture if somebody blinked. You might be asking, "If my camera can do these things, surely I can trust it with the basics such as picking an exposure, white balance and depth of field, right?" Well, yes and no; mostly it comes down to the situation. In my experience, today's cameras do a pretty good job about 9 times out of 10 when they are not put into extreme situations, such as shooting toward a bright light or photographing something moving quickly. In those situations my results flip, and I'm lucky to get 1 good shot for every 10 I take. Since I like to photograph the outdoors, I quickly found sunrises, sunsets and shaded forests to be very difficult to expose correctly. Some of my photos would turn out with odd color casts that I couldn't fix on the computer, or with dark areas that I couldn't lighten enough to give the photograph a realistic look. After doing some research, I realized that it wasn't a bad camera and I wasn't doing anything wrong; I just wasn't using my camera to its full ability. If you're experiencing some of the same issues, I can show you one quick step to better quality photos every time you pull them up on your computer.
JPEG is for Amateurs and RAW is for Professionals
If you think that only professionals shoot in RAW, you are not alone. I knew for a long time that my camera could shoot RAW photographs, but I was under the impression that I didn't need to because I wasn't a "professional." At the time, I didn't even know what RAW was, but somehow I had developed the idea that it just wasn't for me. I couldn't have been more wrong.
Let's start with JPEG. A JPEG file is a compressed photo file saved to the memory card in the camera. JPEG is probably the most widely used photo file format in the world today: our phones turn our photos into JPEG files, JPEG files stored on a computer can be easily shared via e-mail with family and friends, and nearly every camera, big or small, has the ability to store its pictures in JPEG format.
RAW files, on the other hand, are uncompressed (or less compressed) picture files. The simplest way to describe a RAW file is as a "digital negative": it contains all of the information the camera's sensor recorded while the shutter was open. RAW files are much larger and take up more space on a memory card, but all of that extra information becomes very useful when you develop the picture on the computer. Since no in-camera adjustments are made, your RAW files may look flat straight out of the camera, but this is exactly what gives you the ability to fully adjust the white balance or color temperature of a photo that didn't turn out right, and even to recover details that would be lost in a compressed file. Part of the reason is bit depth: a JPEG stores 8 bits per color channel (256 tonal levels), while most RAW files store 12 or 14 bits (4,096 or 16,384 levels), leaving far more room to pull detail out of highlights and shadows.
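If you enjoy tinkering, you can see this flexibility for yourself in code. Below is a minimal sketch, assuming Python with the rawpy and imageio libraries installed and a hypothetical RAW file named IMG_0001.CR2, that develops the very same exposure twice with two different white balance choices, something a finished JPEG simply can't offer:

    # A minimal sketch: developing one RAW "digital negative" two ways.
    # Assumes the rawpy (LibRaw wrapper) and imageio libraries, and a
    # hypothetical RAW file named IMG_0001.CR2.
    import rawpy
    import imageio

    def develop(path, **params):
        # Open the RAW file and render it to an RGB image.
        with rawpy.imread(path) as raw:
            return raw.postprocess(**params)

    # Develop with the white balance the camera recorded at capture time...
    camera_wb = develop("IMG_0001.CR2", use_camera_wb=True)
    # ...then develop the same exposure again, letting LibRaw compute
    # an automatic white balance from the image data instead.
    auto_wb = develop("IMG_0001.CR2", use_auto_wb=True)

    imageio.imwrite("camera_wb.jpg", camera_wb)
    imageio.imwrite("auto_wb.jpg", auto_wb)

Both output files come from the exact same shutter click; only the development decisions differ, and you can make those decisions as many times as you like.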
So...RAW is better. Why would I use JPEG?
There are many who believe that RAW is the only way to take pictures: you capture all the information and have nearly unlimited editing options when you get to the computer. I would agree with this nearly 100% of the time. There are some advantages to shooting in JPEG, though, at least for certain types of photography. If you're a photojournalist, sports or event photographer who needs a quick turnaround, JPEG may be the better option. Since JPEG compression works by discarding image data deemed unnecessary, the camera needs to know what to throw away. By adjusting a few settings, you tell the camera to do some basic editing to the photos before it even saves them to the memory card. This makes the files smaller and quicker to share. The downside is that you lose a lot of the photo data, so there are fewer changes that can be made in your photo software, and it is important to have the settings correct before shooting. If your white balance is set wrong, or your sharpness is set too high or too low, you can end up with an entire batch of unusable photos (and some of these problems won't be caught even on the camera's screen).
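To see the "lossy" part of JPEG in action, here is a small sketch, assuming Python with the Pillow library and a hypothetical source image named photo.png, that saves the same picture at several quality settings and confirms the discarded data never comes back:

    # A small sketch of JPEG's lossy compression in action.
    # Assumes the Pillow library and a hypothetical source image photo.png.
    import os
    from PIL import Image, ImageChops

    original = Image.open("photo.png").convert("RGB")

    for quality in (95, 75, 50):
        # Higher quality keeps more image data and produces a bigger file;
        # lower quality discards more data for a smaller file.
        name = f"photo_q{quality}.jpg"
        original.save(name, "JPEG", quality=quality)
        print(name, os.path.getsize(name), "bytes")

    # The discarded data is gone for good: the decoded JPEG no longer
    # matches the original pixels, and no editor can bring them back.
    reopened = Image.open("photo_q50.jpg")
    diff = ImageChops.difference(original, reopened)
    print("identical to original:", diff.getbbox() is None)  # prints False

The file sizes shrink as quality drops, and the final check shows the re-opened JPEG no longer matches the original pixel for pixel; that's the trade the camera is making every time it saves a JPEG for you.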
If you want the best of both worlds, many cameras will let you shoot in both formats at the same time (often labeled RAW+JPEG). This gives you a quick JPEG file to send off for publication as well as a RAW file to edit further if necessary.
When viewing photos on the internet or in a local gallery, you probably won't be able to tell whether a picture was taken in JPEG or RAW format; almost all of the benefits of shooting RAW show up only when editing your pictures. If you want to take your photos to the next level and try shooting in RAW, all you need is a camera that supports RAW, photo software that supports the RAW format your camera uses (Adobe Photoshop and Lightroom both work), and a subject to take pictures of. Next time you're out shooting for fun or work, take some RAW photos and see how they change your workflow as well as your end results.