The Motivation:

Display hardware does not have a linear response to image input intensity. Monitors are calibrated so that a zero-intensity input yields a zero-intensity output (black data displays as black), and a full-intensity input yields a full-intensity output (white data displays as white). But an input intensity in the midrange (expressed as a number somewhere between 0 and 1) is actually displayed at a reduced intensity. This is a consequence of the power electronics used to convert an input voltage into visible light.

The general equation for this is:

Output = Input ^ Gamma

Where

Output is the output intensity, between 0 and 1
Input is the input intensity, between 0 and 1
Gamma is a positive constant, usually greater than 1, characteristic of the hardware being used

For most monitor hardware, Gamma tends to be fairly standard at around 2.5 (and we say that such a monitor "has a gamma of" 2.5). Because of this relationship, input values very near to zero or full intensity appear slightly darker than intended, while colors near to 50% intensity are strongly darkened. The end result is that images generally appear much darker than intended, overall.
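A quick numeric sketch makes the shape of this curve concrete (Python, with an illustrative function name; the gamma value of 2.5 is the typical monitor figure discussed above):

# Simulate a monitor with a gamma of 2.5.
def monitor_response(intensity, gamma=2.5):
    # Input and output intensities are both expressed between 0 and 1.
    return intensity ** gamma

for value in (0.0, 0.1, 0.5, 0.9, 1.0):
    displayed = monitor_response(value)
    print(f"input {value:.2f} -> displayed {displayed:.3f} (lost {value - displayed:.3f})")

The endpoints 0.0 and 1.0 pass through unchanged, while an input of 0.5 is displayed at roughly 0.18.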

It should be noted that 2.5 isn't some fundamental magic number, but just a coincidence of present day monitor design, and it is entirely possible for other display devices to have very different gamma values. One could even imagine devices with gamma less than 1, which would have the effect of an unwanted overall increase in image intensity. More on this under the topic of 'system gamma', below.

Lastly, the above relationship applies to each color channel independently (Red, Green, Blue), with the ultimate effect that the output hue and saturation can and do display incorrectly. For example, consider a pixel where the Blue input is near 50%, while the Red and Green inputs are closer to full intensity. The output will have a sharply reduced Blue channel, but only weakly reduced Red and Green. The result will be an overall reduction in brightness, but an increase in saturation, and a hue shift toward yellow.
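That example can be sketched in a few lines (the pixel values here are illustrative, not taken from any particular image):

# Apply the monitor response to each channel of a single pixel.
gamma = 2.5
pixel_in = (0.9, 0.9, 0.5)                        # (Red, Green, Blue)
pixel_out = tuple(c ** gamma for c in pixel_in)
print(pixel_out)                                  # roughly (0.768, 0.768, 0.177)
# Blue collapses far more than Red or Green, so the displayed color is
# darker, more saturated, and shifted toward yellow.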

Gamma Correction:

Gamma correction compensates by pre-adjusting each intensity value, bringing the overall input-to-output relationship back toward 1:1, according to

Corrected = Original ^ (1 / Gamma)

The exponent (1 / Gamma) exactly compensates for the change in intensity that the monitor will introduce. The corrected version of the original image data is sent to the monitor, and the image is then displayed with the correct color and brightness throughout.
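Because the correction and the monitor response are mutually inverse exponents, applying both in sequence recovers the original value. A minimal Python sketch (names are illustrative):

def gamma_correct(intensity, gamma):
    # Pre-adjust the image data with the inverse exponent.
    return intensity ** (1.0 / gamma)

gamma = 2.5
original = 0.5
corrected = gamma_correct(original, gamma)   # about 0.758, brighter than the original
displayed = corrected ** gamma               # the monitor darkens it back to 0.5
print(corrected, displayed)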

System Gamma:

As users, what we really care about is the total gamma experienced by an image evaluated across all the processing it is subjected to before the light reaches our eyes - and of course, using gamma correction in an appropriate place to get this overall 'system gamma' as close to 1 as possible. This means taking into account hardware other than the monitor itself, as well as any software that plays a role in displaying the image.

Some video cards, as well as certain applications and even operating systems, perform their own (often partial) gamma correction. If the monitor has a gamma of 2.5, your video card offers partial gamma correction of 1.6, and the operating system provides gamma correction of 1.2, then the system gamma would be evaluated as the cumulative effect of each change in image intensity:

System Gamma = (1 / 1.6) x (1 / 1.2) x 2.5 = 1.3

This is closer to ideal than 2.5, but still not perfect, so the application software displaying the image must provide an additional gamma correction of 1.3. As a real-world example of system gamma, PCs do not provide any built-in gamma correction (so the system gamma is generally the full 2.5 of the monitor), while Macintosh systems provide a correction of 1.4, which evaluates to a system gamma of roughly 1.8.
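Since each correction stage divides the overall exponent and the monitor multiplies its own gamma back in, the system gamma can be sketched as a simple chain (the function below is illustrative; the numbers match the examples above):

def system_gamma(monitor_gamma, corrections):
    # Each correction stage raises the signal to 1/correction;
    # the monitor then raises the result to its own gamma.
    result = monitor_gamma
    for correction in corrections:
        result /= correction
    return result

print(system_gamma(2.5, [1.6, 1.2]))   # about 1.3, the worked example above
print(system_gamma(2.5, []))           # 2.5, a PC with no built-in correction
print(system_gamma(2.5, [1.4]))        # about 1.8, the classic Macintosh figure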

What it means to you:

Many applications provide user-configurable gamma correction capabilities, notably graphics software such as Adobe Photoshop, and many games. In the graphics domain, gamma correction is used primarily to guarantee that the same image appears the same way when displayed on different hardware using different software. In the gaming domain, a higher-than-optimal gamma correction is sometimes used as a benign way of cheating, by brightening dark and difficult-to-see areas of the screen.
