Sunday, 23 January 2011

A580 Focus Shift under Fluorescent Lighting

When I first acquired an A580, it displayed signs of back focus. I sent it in to Sony Service to get it adjusted. After getting it back, I tested to see if the problem had been fixed. Initial tests in daylight were positive, but when I tested it at home under fluorescent lighting, I still found substantial back focus problems. One of my favourite lenses is the 100mm/2 and this seemed particularly affected, but all my f/1.4 to f/2 lenses were rendered unusable wide-open under fluorescent lighting. Noticeable degradation of resolution at the point of focus is also evident in f/2.8 lenses due to focus shift towards the margins of the depth of field.

Fig. 1: AF under incandescent illumination. A580 with 100mm/2 at f/2. Focus is acceptably accurate.

Fig. 2: AF under fluorescent illumination. A580 with 100mm/2 at f/2. Focus is behind the target, leading to significant loss of resolution.

I decided to dig a bit deeper to find out more about what causes the problem. Under daylight and incandescent (blackbody radiation) lighting, focusing was accurate. Under all kinds of fluorescent lighting, the focus fell behind the actual point. A working hypothesis is that the spectral lines in fluorescent lighting, coupled with longitudinal chromatic aberration in the AF light path, cause the phase-detect AF system to focus behind the subject.

Prompted by a suggestion by David Kilpatrick, I printed out a coloured target against a black background to see what wavelengths of light are to blame for the mis-focus. The different pigments in my inkjet printer will absorb various wavelengths of light and hopefully from the different combinations of reflectance, it may be possible to broadly identify the spectral region of interest. All the following crops are from the A580 with the 100mm/2 at f/2 and Auto ISO (generally ISO1600), exported from Lightroom with only white balance and minor exposure changes.

Fig. 3: Colour Target for AF Testing. Colours are supposedly (clockwise from top left) green, red, yellow, magenta, blue, cyan. The gamut of my printer on this particular paper is restricted, hence the rather inaccurate colours.
Under incandescent lighting, all colours were accurately focussed. Three target colours, green, cyan and blue, gave no problems under fluorescent lights, but red and yellow gave out of focus results. Magenta was also out of focus under fluorescent lights but not quite to the same extent as red and yellow.

Fig. 4: Green target under fluorescent lighting, AF.

Fig. 5: Magenta target under fluorescent illumination, AF. Though not as degraded as the red target, this is clearly not in focus.

Fig. 6: Red target under fluorescent lighting, AF. Yellow target shows similar results.

Fig. 7: Red target under fluorescent lighting, MF. Exposure difference due to using manual focus check Live View mode.

Fig. 8: Red target under incandescent illumination, AF.
From these results, one can surmise that a spectral peak in the yellow to red region of the spectrum creates a spurious, displaced image in the phase-detect AF system, and this causes the backfocus. Backfocus is approximately 11 cm for a target at 3 m, which indicates a shift of the image plane by approximately 140 microns. In comparison, the depth of field is only 3 cm in front and 4 cm behind the subject, and the depth of focus is only 35 microns.
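As a sanity check on these figures, the thin-lens equation converts the observed 11 cm backfocus at 3 m into an image-plane shift. This is only a sketch: it assumes distances are measured from the lens and ignores the thickness of the real optics.

```python
# Convert the observed back-focus into an image-plane shift using the
# thin-lens equation. Assumption: distances are measured from the lens.

def image_distance(f_mm, subject_mm):
    # Thin-lens equation 1/f = 1/u + 1/v, solved for the image distance v.
    return 1.0 / (1.0 / f_mm - 1.0 / subject_mm)

f = 100.0          # focal length of the 100mm/2, in mm
u = 3000.0         # intended subject distance: 3 m
backfocus = 110.0  # AF actually focused ~11 cm behind the subject

shift_mm = image_distance(f, u) - image_distance(f, u + backfocus)
print(f"image-plane shift = {shift_mm * 1000:.0f} microns")
```

This gives roughly 125 microns, the same order as the ~140 micron figure above; the exact value depends on where the distances are measured from.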
Fig. 9: Green target under incandescent illumination, AF. This is even sharper than the comparative shot under fluorescent lighting.

Under illumination by a continuous spectrum, a distinct image would not be rendered at a different position, hence the accurate AF under incandescent lighting, which has a lot of yellow/red light compared to green/blue. Under low pressure sodium street lighting (spectral lines at 589.0 nm and 589.6 nm), backfocus is very evident. However, backfocus is also seen under LED illumination, which generally has a broadly continuous spectrum, hence a more complete explanation is still required.

In comparison, the A700 does not seem to display such a focus shift under different lighting. It is unfortunate that the A580 has this trait as the widespread adoption of fluorescent lighting (and increasingly LED) means that the A580 is handicapped in its use of large aperture lenses, especially in conditions where they are needed most. As the A700 AF system seems more robust to changes in illumination, Sony should incorporate those elements responsible for its reliable performance under a range of lighting into future models. Ideally, they should also develop an improved AF module for retrofit into affected A580 cameras.

Friday, 14 January 2011

Part II: How a camera works and takes a photo

In this part of my guide for beginners, I'll explain how cameras form images.

The main elements of a camera are the lens and the sensor. The lens captures light, "processes" it, and then projects an image onto the sensor. The sensor records the image falling onto its surface. The rest of the camera supports these two main ingredients.

The Lens

The lens, very simply, takes light travelling along one set of paths and redirects it along different paths. The two important attributes of a lens are how much light is gathered, and how much the light is bent. A lens will usually be specified by a set of numbers and characters, e.g. 100mm f/4 or 24-105mm/3.5-4.5.

The first attribute, how much light is gathered, is the light-gathering ability, or effective aperture of the lens. It is the effective size of the hole through which light can make it through to the sensor. The bigger the hole, the more light can be gathered in any given amount of time, hence photos can be taken "faster". This is why large aperture lenses are called "fast". The size of the aperture is usually given as a ratio of the focal length f (discussed below), e.g. f/4. This means that the diameter of the effective aperture is one quarter of the focal length of the lens. Confusingly, the "/" is sometimes omitted, e.g. f4 or F4. A larger aperture is denoted by a smaller number (dividing the focal length by a smaller denominator gives a bigger aperture).
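To make the ratio concrete, here is a quick sketch computing the aperture diameter for a 100mm lens at a few illustrative f-numbers:

```python
# Effective aperture diameter = focal length / f-number.
focal_length_mm = 100.0
for f_number in (2.0, 4.0, 11.0):
    diameter_mm = focal_length_mm / f_number
    print(f"f/{f_number:g} on a 100mm lens -> {diameter_mm:.1f} mm opening")
```

f/2 gives a 50 mm opening while f/11 gives only about 9 mm: the smaller f-number really is the bigger hole.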

The second attribute, how much the light is bent, is the focal length of the lens, e.g. 100mm. A long focal length lens bends light less than a short focal length lens. Only light coming from nearly straight ahead will be bent enough by a long focal length lens to make it to the sensor, hence a long focal length lens has a smaller angle of view. This gives a magnified view of the scene. A short focal length lens bends light more, so light coming in from a greater angle will make it to the sensor, giving a wide angle of view.

The angle of view of a particular focal length on a particular sensor size can be visualised easily using a rectangular cut-out of the same size as the sensor, and a piece of string the same length as the focal length of the lens. By holding up the cut-out at that distance from your eye (using the string to gauge the distance), the view through the cut-out will show the angle of view "seen" by the sensor looking through the lens. Bringing the cut-out closer increases the angle of view; putting it further away narrows the view.
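The same geometry can be computed directly: the angle of view is 2·atan(d / 2f) for a sensor dimension d and focal length f. The sketch below assumes an APS-C sensor width of about 23.6 mm (full frame would be 36 mm):

```python
import math

# Angle of view = 2 * atan(sensor_dimension / (2 * focal_length)).
# Assumption: APS-C sensor width of ~23.6 mm.

def angle_of_view_deg(sensor_mm, focal_mm):
    return math.degrees(2.0 * math.atan(sensor_mm / (2.0 * focal_mm)))

for focal in (24, 50, 100):
    print(f"{focal}mm lens: {angle_of_view_deg(23.6, focal):.0f} degrees horizontal")
```

A 24mm lens sees roughly 52 degrees horizontally on this sensor, while a 100mm lens sees only about 13 degrees, matching the wide versus magnified views described above.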

A zoom lens can vary its focal length, by moving lens elements around, either internally or by telescoping the whole length of the lens. On most lenses, this will also affect the effective aperture of the lens, e.g. 24-105mm f/3.5-4.5 means that the lens has an effective f/3.5 aperture at 24mm focal length setting, and f/4.5 at the 105mm focal length setting. A few lenses are designed so that the aperture stays constant throughout the zoom range, e.g. 70-200mm/2.8 will have a constant f/2.8 aperture regardless of the focal length setting.

The aperture specifications discussed above refer to the maximum aperture that the lens is capable of. Usually, lenses can be set to close down the aperture, e.g. an f/4 lens could be set to f/11 for a particular shot. The minimum aperture is sometimes also quoted, typically in the range f/16-f/32, but this is usually less of a concern in most situations.

The Sensor

The sensor records the light falling upon it. In times gone by, this would have been a piece of film. Nowadays, the sensor is typically a piece of silicon upon which specialised electronic circuits have been etched.

Film consists of a backing material (polyester, glass etc.) upon which an emulsion of silver halides is coated. Silver halides are light-sensitive: they react when exposed to light. This reaction can be made visible by processing with chemicals to develop the latent silver grains. The unexposed silver halide can be dissolved away by fixer, leaving the exposed silver grains as a negative record of the light falling onto the film. This negative can then be used to project an image upon photographic paper (which works essentially the same way as photographic film), so that you get a negative image of the negative, which results in a positive image.

A silicon CCD or CMOS imaging sensor operates using the same basic principle. When visible light falls on silicon, the light is absorbed and in the process, an electron is released. This free electron can then be collected in a photodiode in each pixel. The difference between CCD and CMOS is in the way that these collected photoelectrons are measured. In a CCD (charge coupled device), the electrons are transported, bucket-brigade like, to the periphery of the imaging area to special sense nodes where the total charge collected in each pixel is then measured. A CMOS sensor does not transport the electrons, but instead uses in-built transistors surrounding each pixel in order to measure the charge in situ.
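The bucket-brigade transfer can be illustrated with a toy model. This is only a sketch of the charge-transfer order, not a model of any real sensor:

```python
# Toy CCD readout: charge packets are shifted row by row into a serial
# register, then clocked out one pixel at a time to a single sense node.

def ccd_readout(pixels):
    readout_order = []
    rows = [row[:] for row in pixels]       # copy: the real transfer is destructive
    while rows:
        serial_register = rows.pop(0)       # shift one whole row down
        while serial_register:
            readout_order.append(serial_register.pop(0))  # clock out one pixel
    return readout_order

frame = [[10, 20],
         [30, 40]]                          # electron counts in a tiny 2x2 sensor
print(ccd_readout(frame))                   # [10, 20, 30, 40]
```

A CMOS sensor skips the transport step entirely: each pixel's charge is converted on the spot by the transistors at that pixel.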

The charge in each pixel is measured by a charge-voltage transducer. By varying the gain of this transducer, the effective ISO or sensitivity of the sensor can be altered. The voltage is then digitised by an ADC (analogue to digital converter) to give a number which is then recorded by the camera.
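A rough sketch of that readout chain follows; the gain model, full-well capacity and bit depth are entirely illustrative assumptions, not the A580's real figures:

```python
# electrons -> amplifier gain (ISO) -> ADC digital number.
# All the constants here are illustrative assumptions.

def read_pixel(electrons, iso, base_iso=100, full_well=40000, adc_bits=12):
    gain = iso / base_iso                             # higher ISO = more amplification
    signal_fraction = min(electrons * gain, full_well) / full_well
    return round(signal_fraction * (2**adc_bits - 1))  # quantise to a digital number

print(read_pixel(10000, iso=100))   # a quarter of full scale at base ISO
print(read_pixel(10000, iso=400))   # same light, 4x gain -> hits full scale
```

Note that raising the ISO does not gather more light; it only amplifies the same collected charge, which is why the second reading clips at full scale.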

Taking a Photo

We will assume that we are using a compact digital camera; there are minor variations for SLRs and other types of digital cameras. The photo taking process then proceeds as follows:
  • Composition. The scene to be taken is framed.
  • Focus. The camera will adjust the lens so that the objects of interest are in focus. This is achieved by moving elements of the lens, or the whole lens itself in relation to the sensor.
  • Metering. The camera evaluates how much light to let through the lens and strike the sensor. This is controlled by two factors, the working aperture of the lens and the length of time the shutter stays open. The camera may also determine an effective ISO of the sensor for use in the readout phase.
  • Aperture closure. The aperture is closed down to the working aperture determined in the above step.
  • Shutter opening. Depending on the camera, the shutter may have already been open for the composition, focusing and metering steps above. But before the actual taking of the photo, the shutter will usually be closed in order to reset the sensor (make it blank). When the actual photo is taken, the shutter will open to allow the image-forming light to fall upon the sensor.
  • Sensor exposure. The light gathered by the lens is directed onto the sensor. This liberates free electrons from the silicon, which are collected in each pixel.
  • Shutter closure. At the end of the exposure, the shutter closes to prevent extraneous light reaching the sensor.
  • Sensor readout. The photoelectrons are now measured and recorded.
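The sequence above can be summarised as a runnable sketch; the Camera class here is purely a stand-in that records the order of operations, not any real camera API:

```python
# Illustrative capture pipeline; every name here is a stand-in.

class Camera:
    def __init__(self):
        self.log = []

    def step(self, name):
        self.log.append(name)

def take_photo(cam):
    cam.step("compose")         # frame the scene
    cam.step("focus")           # move lens elements until the subject is sharp
    cam.step("meter")           # choose working aperture, shutter time and ISO
    cam.step("close aperture")  # stop down to the working aperture
    cam.step("reset sensor")    # shutter closed: blank the sensor
    cam.step("open shutter")    # expose: photons liberate electrons in each pixel
    cam.step("close shutter")   # end of exposure
    cam.step("read sensor")     # measure, amplify and digitise the charge
    return cam.log

print(take_photo(Camera()))
```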
To make better photos, the main elements to consider are composition, focus, aperture, shutter speed, and ISO. In the next section, we will see how these elements interact to affect the final photo.