When manipulating images digitally, most software (renderers, compositors, paint packages, etc.) assumes that your system is linear - that is, 8 is twice as bright as 4. They need to do this because they're performing calculations about the amount of light bouncing around in your digitally simulated scene (for example). At some point this linear data needs to be converted into the non-linear space that your screen uses.
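If you want to see what that conversion looks like in practice, here is a minimal sketch in Python (the function names are mine, and a simple power-law display with exponent gamma is assumed):

# Minimal sketch of the linear <-> display conversion for a power-law display.
# The names are illustrative; gamma is the display's exponent (often near 2.2).

def linear_to_display(value, gamma=2.2):
    """Encode a linear intensity in 0.0-1.0 for a display with the given gamma."""
    return value ** (1.0 / gamma)

def display_to_linear(value, gamma=2.2):
    """Recover the linear intensity from a gamma-encoded value in 0.0-1.0."""
    return value ** gamma

# A linear value of 0.5 has to be sent to the display as roughly 0.73;
# send it as 0.5 and it will appear much darker than half intensity.
print(linear_to_display(0.5))   # ~0.73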
Ideally you'd simply program the gamma into your display driver and then forget about it. On some systems you can do this, but others ignore gamma and feed the digital values from the application straight to the video system - the result is very dark images. Others perform some correction, but not enough.
To make things worse, upon finding their image is dark, users adjust the lighting to compensate. They then transfer the image to video tape through a calibrated system and find it is now washed out! Then they blame the transfer system. Rendering systems often allow you to set the gamma of the final image (RiExposure in RenderMan), but setting this for your screen will prevent compositing, and the image may be incorrect on other displays.
SGI and NeXT both handle gamma correction as part of the windowing system, so it's not much of a problem, while NT is a total disaster. For that reason, under NT all my apps have a gamma correction function built in (this may not be implemented in all currently released versions). The default value of GAMMA is 2.0 (which looked good on my machine), but you can control it by setting the environment variable GAMMA. DCT apps on non-DOS based platforms assume that your display driver is doing the right thing and apply no gamma correction by default. However, you can still use the GAMMA variable to make DCT apps gamma correct themselves.
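For what it's worth, the correction the GAMMA variable asks for amounts to something like the following sketch (an illustration of the idea, not the actual code in my display library):

import os

# Read GAMMA from the environment, defaulting to 2.0 as described above, and
# build an 8-bit lookup table that gamma corrects pixel values just before
# they are handed to the display. Illustrative only.

gamma = float(os.environ.get("GAMMA", "2.0"))
lut = [round(255.0 * (i / 255.0) ** (1.0 / gamma)) for i in range(256)]

def correct_pixel(value):
    """Map a linear 8-bit value to its gamma-corrected display value."""
    return lut[value]

print(correct_pixel(128))   # with GAMMA=2.0 a mid grey becomes ~181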
Run "gammaCalc", and a checker board pattern appears. By clicking from left to right across the image the relative brightness of the squares changes - click until you find the spot where the brightness of the squares is as uniform as possible, then quit. The value of gamma will be printed to the terminal when you press space (or q to quit). Depending on the version of gammaCalc you're running you may be able to use the right mouse button to check the gamma correction for each channel indepentantly.
For this to work you should set the gamma correction of your display to 1.0 before running the app. Under NT you must set the environment variable GAMMA=1 to stop my display library from double correcting (sorry...). Ideally you should now be able to set your display driver's gamma to the value just calculated, and forget about things. This can be done on most Unix based platforms by using the xgamma command.
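In case you're curious where the number comes from, the usual maths behind this kind of checkerboard test is easy to reproduce - the following is my sketch of the principle, not gammaCalc's actual source:

import math

# A fine pattern of alternating black and white pixels emits half the light of
# solid white, so the solid grey value v that visually matches it satisfies
# (v / 255) ** gamma == 0.5. Solve that for gamma.

def gamma_from_match(matching_grey, max_value=255):
    """Estimate display gamma from the grey level that matches a 50% checkerboard."""
    return math.log(0.5) / math.log(matching_grey / max_value)

print(gamma_from_match(186))   # a match near 186 implies a gamma of about 2.2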
OK - let's make things easier...
You've got an image which looked good on some random (uncalibrated)
machine, but under your new, nicely calibrated system it's wrong - probably
washed out. You need to find out what gamma correction that machine SHOULD
have had (say 1.9), and apply the inverse. If your machine isn't gamma
correcting either, but you know the value (say 1.6), you can build that
into the conversion too, so the image should look the same as it did on
the remote machine! To do this use "regamma" (available for NT, FreeBSD,
Linux, and SGI):
regamma 1.9 1.6 image.tiff corrected.tiff
Suppose I've got an image that I produced before I discovered gamma
correction. I've found that I need a value of 1.6 for my machine, and I've
set up my display to do that. I can fix my old images so they look like
they used to using:
regamma 1.6 1.0 image.tiff corrected.tiff
I'm converting from an old gamma of 1.6 which was built into the image,
and making the image linear (1.0).
To convert an image for viewing on an uncalibrated system when it has
been produced on a calibrated machine we run:
regamma 1.0 2.2 image.tiff corrected.tiff
where 2.2 is the gamma of the non-correcting machine.
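Reading between the lines of those examples, the per-pixel operation regamma performs is presumably something like this (my reconstruction, assuming 8-bit samples - not the tool's actual source):

def regamma_pixel(value, old_gamma, new_gamma, max_value=255):
    """Re-encode a pixel from one display gamma to another."""
    linear = (value / max_value) ** old_gamma              # undo the old encoding
    return round(max_value * linear ** (1.0 / new_gamma))  # apply the new one

print(regamma_pixel(128, 1.6, 1.0))   # "regamma 1.6 1.0" darkens a mid grey to ~85
print(regamma_pixel(128, 1.0, 2.2))   # "regamma 1.0 2.2" brightens it to ~186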
Of course, all this image processing is degrading the quality of your image. That's probably OK when you're just viewing it, but repeatedly shuffling it between machines and applying regamma will rapidly degrade the image. One way to avoid this is to use a higher bit depth - say 16 bits per sample. Then you can correct the image and still have enough precision left.
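If you want to convince yourself, the following sketch gamma-decodes each 8-bit level through an intermediate buffer of a given depth and counts how many distinct levels survive the round trip (the numbers are illustrative; 2.2 is just a typical gamma):

def roundtrip(v, gamma=2.2, mid_levels=256):
    """Decode an 8-bit value to a quantised linear intermediate, then re-encode it."""
    linear = round((v / 255.0) ** gamma * (mid_levels - 1))
    return round((linear / (mid_levels - 1)) ** (1.0 / gamma) * 255)

for mid_levels in (256, 65536):
    print(mid_levels, len({roundtrip(v, mid_levels=mid_levels) for v in range(256)}))

# With an 8-bit intermediate a large chunk of the dark levels collapse together;
# with a 16-bit intermediate essentially all 256 levels come back intact.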
Of course the real answer is to calibrate all your systems and always work with a Gamma of 1.0!
On NT you should find your gamma using gammaCalc, then set the environment variable GAMMA to the result, and preview all your images using "viewer". Clearly this isn't ideal, so an alternative is to get the images looking as good as possible under NT's normal imaging system, then use regamma to convert them to linear (or whatever else is required) when you export them (regamma 2.0 1.0 xxx.tif out.tiff). If this approach is used, then lighting in rendered images may be incorrect. To fix this, set the output gamma of your renderer to be your screen's gamma, but remember that the images should be linearized before comp'ing or exporting.
If you're really serious about colour fidelity then you need to look at the phosphors and gamut of your monitor. At the very least you should calibrate the gamma for each channel separately - my Sony Trinitron seems to put too much green into the mid values, suggesting a lower value of gamma is required for that channel. The very idea of colour is built on some very shaky foundations, and unless you've cash to throw at the problem there's not too much you can do - a decent reference monitor goes for about $7,000 (read Hall '89 for a cheaper introduction to the concept of colour).
Having said that it can't be done, I do have plans for software to manage colour correction between platforms, but I figured most people would need to get their heads round gamma first! As always, development will be driven by user feedback...