Our cameras can see more than we do—but because we prefer to photograph the world the way humans see it, cameras are designed to ignore certain types of light that aren’t normally visible. “Normally in photography you don’t want to shoot any ultraviolet light,” explains photographer Richard Robinson.
We can’t see ultraviolet light—the type of light that gives you sunburn, the type that sunglasses are designed to block to prevent damage to your eyes. Nor can we see infrared light, which lies beyond the other end of the visible spectrum.
Digital cameras contain a filter on the sensor to block ultraviolet and infrared wavelengths from entering, and lenses have coatings to block ultraviolet light. Photographing how something looks under ultraviolet light involves removing this filter, greatly expanding the sensitivity of the camera.
In the field at Boundary Stream, Robinson set up two cameras to shoot in parallel—one documenting ultraviolet (UV) and the other red-green-blue (RGB) light. At Auckland Museum, he used a single camera, swapping a filter that only allowed UV light through for one that only allowed RGB wavelengths, in order to capture the same image in different types of light.
The final images you see in this magazine are a blend of the ultraviolet and red-green-blue images—an approximation of how we think birds see, and how birds look to other birds, based on the research conducted so far. For most of the bird species pictured, this is the first-ever attempt to visualise how they may appear to each other, not just to human eyes.
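The article doesn’t specify how the UV and RGB exposures were combined for publication, and the real processing is informed by research on bird vision. Purely as an illustrative sketch, a simple weighted overlay—mapping the UV channel to a violet false colour, a choice made up here for demonstration—could be written in Python with NumPy like this:

```python
import numpy as np

def blend_uv_rgb(rgb, uv, uv_weight=0.5):
    """Blend a UV intensity image into an RGB image as a false-colour overlay.

    rgb: (H, W, 3) float array, values in [0, 1]
    uv:  (H, W) float array, values in [0, 1]

    NOTE: the violet tint and 50/50 weighting are illustrative assumptions,
    not the processing actually used for the published images.
    """
    # Map the single UV channel into RGB space as a violet tint.
    uv_rgb = np.stack([0.6 * uv, np.zeros_like(uv), uv], axis=-1)
    # Weighted average of the two exposures, clipped to valid range.
    return np.clip((1 - uv_weight) * rgb + uv_weight * uv_rgb, 0.0, 1.0)

# Toy example: a uniform grey RGB frame and a UV frame with one bright pixel.
rgb = np.full((2, 2, 3), 0.5)
uv = np.array([[0.0, 1.0],
               [0.5, 0.0]])
out = blend_uv_rgb(rgb, uv)
```

Pixels with no UV signal simply darken toward the RGB exposure, while UV-bright regions pick up the false-colour tint—the same general idea, if not the same maths, as the published composites.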
These images are the result of close to a year of experimentation and modifications to create a full-spectrum camera. The next question is—how does the rest of the world look in UV?