Gamma Explained

If you’ve worked in design for any appreciable time you’ll have heard the term ‘gamma’ bandied about in reference to computer displays, possibly along with the assertion that the Mac’s gamma setting is 1.8 and the average PC’s gamma is 2.2. So what on earth is this gamma thing, then? Well, just read on. You may wish you'd never asked!

Gamma describes the response of CRT (cathode ray tube) displays to input voltage: what is actually shown when they are fed a monitor signal. These electron gun-driven devices don't produce a light intensity (effectively, a brightness) that is proportional to the input voltage, which means that different input levels aren't reproduced entirely accurately without correction.

The intensity of light reproduced by a CRT display is described, technically, as being ‘proportional to the input voltage raised to the power gamma’. The upshot of this is that while the extreme ends of the brightness range will be reproduced as they should be, the points in between, especially the midtones and shadows, will suffer.

This gamma value, resulting from an effect caused by the electrostatic charge in the tube’s electron gun, is typically around 2.4 or 2.5, and is normally said to be 2.5 for simplicity’s sake. As a result, an input brightness level of 50% (midway along the gamma curve) will be presented as a light intensity of just 18% unless some form of correction is applied. (The formula for working out the effect of the gamma on brightness values is simply input raised to the power of gamma = output, so 50% ^ 2.5 = 18%.) Although the bright end of the scale is hardly affected, midtones are reproduced far too darkly and shadows fill in almost completely.
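As a quick illustrative sketch (not part of any real display pipeline), the formula above can be tried out on a handful of brightness levels. The function name and the choice of sample levels are ours; the 2.5 exponent is the typical uncorrected CRT figure quoted above.

```python
# Sketch of an uncorrected CRT's response: output = input ^ gamma.
# Values are fractions of full brightness; 2.5 is the typical tube gamma.
CRT_GAMMA = 2.5

def displayed_intensity(input_level: float, gamma: float = CRT_GAMMA) -> float:
    """Light intensity a CRT produces: the input raised to the power gamma."""
    return input_level ** gamma

for level in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"input {level:>4.0%} -> displayed {displayed_intensity(level):.0%}")
```

Running this shows the pattern the article describes: the ends of the scale (0% and 100%) come out untouched, 50% input collapses to around 18%, and a 25% shadow tone all but disappears at about 3%.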

To counter this extreme effect, when television standards were established an in-camera gamma correction of 0.45 was arrived at (and has been used ever since), bringing the effective display gamma down to 2.2. This level was chosen because it allows for the typically fairly dim TV-watching environment; with a more daylight-oriented correction, midtones and shadows would be rather too bright.
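The two stages compose by simply multiplying the exponents, which a tiny sketch makes plain. The function names here are ours for illustration; the 0.45 camera figure and the 2.5 tube figure are the ones quoted above, and their product (about 1.125) is the small residual gamma left deliberately to suit dim viewing rooms.

```python
# Sketch of the TV pipeline described above: the camera encodes with a
# gamma of 0.45 (roughly 1/2.2), then the CRT applies its native ~2.5.
CAMERA_GAMMA = 0.45   # in-camera correction
CRT_GAMMA = 2.5       # native tube response

def end_to_end(scene_level: float) -> float:
    encoded = scene_level ** CAMERA_GAMMA   # camera-side correction
    return encoded ** CRT_GAMMA             # tube's own response

# The net exponent is 0.45 * 2.5 = 1.125, so midtones emerge only
# slightly darker than the original scene:
print(f"50% scene brightness -> {end_to_end(0.5):.0%} on screen")
```

A 50% scene level comes out at roughly 46% rather than the 18% an uncorrected tube would give.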

Computers are normally used in environments which are a little brighter than the average sitting room, so there ought to be some further compensation applied. This isn't the norm with PCs; the effective corrected gamma setting of a Windows-based PC is generally around 2.2, arrived at by the graphics card and drivers rather than anything standard in the OS itself. This appears to be a historical leftover from the crude TV-style monitors of the early modern computing era. The Mac’s effective gamma setting of 1.8, which renders 50% brightness input as 29% brightness on output, is used because of its long association with print design and production.
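The practical gap between the two platforms' settings can be seen by feeding the same midtone through both effective gammas; this is just the power law again, with the 1.8 and 2.2 figures from the text.

```python
# Comparing the classic Mac (1.8) and typical Windows PC (2.2)
# effective gamma settings for the same 50% input level.
for gamma in (1.8, 2.2):
    out = 0.5 ** gamma
    print(f"effective gamma {gamma}: 50% input -> {out:.0%} displayed")
```

The Mac's 1.8 renders the midtone at about 29%, as stated above, while the PC's 2.2 pushes the same tone down to roughly 22%, which is why the same image looks a touch darker on a typical PC screen.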

Macs have always used specific gamma correction lookup tables in the display hardware. These aren’t set to compensate entirely; midtones and shadows are still rendered slightly more darkly than is purely scientifically accurate. The gamma setting of 1.8 matches the typical output characteristics of printed halftones. Dot gain, where printed dots tend to spread, increases with mid and darker tones (hitting a peak at different percentage levels depending on the precise halftone dot shape), producing a gamma-like effect on mid-tone and shadow print levels which is matched by the Mac’s default 1.8 gamma setting. In short, what we see on the screen will have similar relative brightness values in print.
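The lookup tables mentioned here can be sketched in a few lines. This is a minimal, hypothetical reconstruction, not Apple's actual hardware table: it assumes 8-bit entries and derives the stored value by dividing the target exponent (1.8) by the tube's native one (2.5), so that the two stages compose to the desired response.

```python
# Hypothetical sketch of a display-hardware gamma correction table:
# 256 8-bit entries that pre-brighten each input level so a tube with a
# native ~2.5 response nets out to an effective gamma of 1.8.
CRT_GAMMA = 2.5
TARGET_GAMMA = 1.8

# To get output = input^1.8 from a tube that does output = v^2.5,
# the table stores v = input^(1.8 / 2.5) for each input level.
lut = [round(255 * ((i / 255) ** (TARGET_GAMMA / CRT_GAMMA)))
       for i in range(256)]

# Black and white pass through untouched; midtones are lifted.
print(lut[0], lut[128], lut[255])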

This difference between Macs and PCs can lead to unexpected colour matching problems. If a colour is made up of 75% red and 25% blue, an uncorrected display will reduce the red value from 75% to 49% and the blue value from 25% to 3%. A display set to an effective gamma of 1.8 will reduce the red value to 60% and the blue to 8%. The ratio between the strength of the red and the strength of the blue is about 16:1 with an uncorrected display and 7:1 on a standard Mac setup. Technically, this still produces a colour shift compared with the original mathematical values, which had a ratio of just 3:1. However, it is roughly the same as the shift produced by standard dot gain effects in commercial halftone printing - which equals no shift at all when comparing the Mac’s display to print.
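The red/blue figures in this paragraph are easy to verify with the same power law; the small helper below is ours, but the 75%/25% inputs and the gamma values are those quoted above.

```python
# Reproducing the red/blue example: how gamma changes the ratio
# between a 75% red component and a 25% blue component.
def channel_ratio(red: float, blue: float, gamma: float) -> float:
    return (red ** gamma) / (blue ** gamma)

for label, gamma in (("original values ", 1.0),
                     ("uncorrected CRT ", 2.5),
                     ("Mac at 1.8      ", 1.8)):
    r = channel_ratio(0.75, 0.25, gamma)
    print(f"{label}: red:blue ratio ~ {r:.0f}:1")
```

This confirms the figures in the text: 3:1 in the original values, roughly 16:1 on an uncorrected display, and about 7:1 at the Mac's 1.8 setting.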

The gamma value you should use depends on what kind of work you do. Using the 1.8 level of gamma correction, which is standard for Mac-driven displays, means that the display is corrected to be more accurate in terms of rendering print designs, and generally closer to the reflective colour tonal values that we see in the real world. If you only ever produce images for the web, where most people will view your work using PCs, you may benefit from working at gamma levels nearer to 2.2. Of course, Mac users may consider your images rather washed out by comparison. Alternatively, if your images have embedded profiles then you can be more certain they’ll be displayed correctly by modern browsers on either platform. Certainly, if you do print work your gamma settings should be around 1.8. Finally, the Gamma Toggle utility can be invaluable, as it can show what images look like on PCs one moment and on Macs the next. Of course, a good general awareness of the issue can be just as useful, so develop a sense of what things will shift and by how much.
