Last time, we discussed the color of the “typical” sun, but also implied that there really is no such thing. So let’s try to understand what sunlight really looks like at any given time, and more importantly, how it affects our photography. We’ll start by examining something really clever, called the “CIE 1931 color space chromaticity diagram.” The name’s not important, but hang in there for a few minutes while I explain it. Trust me, it’s worth your time, since it becomes the foundation of how we, and our cameras, perceive color.
Here are the three most important things you need to know about this diagram:
- Every color along the bold line around the perimeter of the horseshoe can be represented by a single wavelength (those numbers in blue). Simple.
- All the colors contained within the horseshoe area can be produced by various combinations of those single-wavelength colors. Want to try something interesting? Print this diagram (in color, please), and get yourself a pen and a ruler. Now draw a straight line between the 500 mark and the 600 mark. Next, draw another straight line between the 480 mark and the 560 mark. See that point where those two lines intersect? That color, whatever you want to call it, can be made by mixing 500 nm and 600 nm light in the right proportions, or by mixing 480 nm and 560 nm light. How many straight lines can you draw through that point? Well, that’s how many different ways there are to make the same color. (If you’d rather let a computer do the geometry, see the short sketch after this list.)
- You’ve heard of color temperature, right? Well, that black curved line (with the hash marks and the labels going from 1500 to 10,000) represents color temperature, in kelvins. It traces the colors an ideal glowing object takes on as it gets hotter and hotter.
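If you’d rather check that pen-and-ruler exercise numerically, here is a minimal Python sketch. The xy chromaticity coordinates listed for 480, 500, 560, and 600 nm are rounded, approximate values for the CIE 1931 2° standard observer (treat them as illustrative assumptions, not reference data); the script simply finds where the two chords cross.

```python
# Minimal sketch: where do the 500-600 nm and 480-560 nm chords cross
# on the CIE 1931 xy chromaticity diagram?

# Approximate xy chromaticities of monochromatic light (CIE 1931 2-degree
# observer). These values are rounded and meant only for illustration.
locus = {
    480: (0.0913, 0.1327),
    500: (0.0082, 0.5384),
    560: (0.3731, 0.6245),
    600: (0.6270, 0.3725),
}

def intersect(p1, p2, p3, p4):
    """Intersection of the line through p1-p2 with the line through p3-p4."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / denom
    py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / denom
    return px, py

x, y = intersect(locus[500], locus[600], locus[480], locus[560])
print(f"The two chords cross at roughly x = {x:.3f}, y = {y:.3f}")
# Any pair of wavelengths whose chord passes through this point can be mixed,
# in the right proportions, to produce this same chromaticity.
```

With these numbers the crossing lands at roughly (0.28, 0.47), comfortably inside the horseshoe, which is exactly the point your two pen lines should meet.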
Now, compare the colors along that color temperature line with the colors in the diagram below.
Notice any similarities? I sure hope so, since that’s the range of lighting conditions we see throughout the course of a day. And if we see it, so do our cameras. The difference, however, is that our brains do a darn good job of normalizing these changing conditions, so that objects appear the way we expect them to, not the way the current light would suggest. An example of this is a scene shot in a shady spot on a clear, blue-sky day. Technically, something white or neutral in that setting should look blue to us, since it is lit primarily by blue light from the sky. But our brains, expecting the item to look white or neutral, make it look white or neutral (not blue). Our cameras, on the other hand (not being quite so intelligent), need to be “told” what those conditions are so that they can provide us with the image we expect to see.
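To make that “telling the camera” idea a little more concrete, here is a minimal Python sketch of one very simple white-balance strategy, the so-called gray-world assumption: assume the scene should average out to neutral, and scale each color channel until it does. Real cameras and raw converters are far more sophisticated than this, and the numbers below are invented for illustration.

```python
import numpy as np

def gray_world_balance(image):
    """Scale each RGB channel so the image's average comes out neutral.

    image: float array of shape (height, width, 3), values in [0, 1].
    The "gray-world" assumption is a deliberately simple stand-in for
    what a camera does once it knows the lighting conditions.
    """
    means = image.reshape(-1, 3).mean(axis=0)   # average R, G, B of the scene
    gains = means.mean() / means                # per-channel correction factors
    return np.clip(image * gains, 0.0, 1.0)

# A made-up "white card in open shade": neutral gray with a blue cast.
shady_gray = np.full((4, 4, 3), [0.42, 0.47, 0.60])
balanced = gray_world_balance(shady_gray)
print("before:", shady_gray[0, 0], "after:", balanced[0, 0])
```

Run it and the blue-tinted gray comes back neutral, which is, in effect, what your camera is doing for you when its white balance setting matches the light.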
And that, in short, is what the term “white balance” is all about (and not coincidentally the topic of my next post).
Until then … Happy Shooting!

Mike, a Pennsylvania native, is a metallurgical engineer and avid photographer. A graduate of Lafayette College in Easton, he is the president of Opus Technologies LLC. Mike enjoys experimenting with various photographic techniques, evaluating (playing with) new equipment, and discussing all aspects of photography with anyone who will listen. Discover his Candid Paw pet photography website, Facebook page, and Etsy shop.