I'd like to add the ability to adjust screen gamma at application startup and reset it at exit. It's debatable whether one should tamper with gamma at all (personally I find it useless and detrimental), but hey, some people expect to be able to do that kind of thing.
It's just one simple API call too, so it's all easy, right?
MSDN says: "The gamma ramp is specified in three arrays of 256 WORD elements each [...] values must be stored in the most significant bits of each WORD to increase DAC independence." In my understanding, this means something like word_value = byte_value<<8, which sounds rather weird, but that's how I read it.
The Doom3 source code contains a function that takes three arrays of char values and converts them into an array of uint16_t values in which each element has the same byte value in both the upper and lower half. In other words, something like word_value = (byte_value<<8)|byte_value. This is equally weird, but what's worse, it is not the same as above.
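For concreteness, the two readings differ only in how the low byte of each WORD is filled. A minimal sketch of both conversions (my own illustration, not taken from MSDN or Doom3):

#include <cstdint>

// MSDN reading: the byte value goes into the most significant bits of the
// WORD, leaving the low byte zero (0xFF -> 0xFF00).
inline uint16_t ramp_msdn(uint8_t b)  { return static_cast<uint16_t>(b << 8); }

// Doom3 reading: the byte value is replicated into both halves of the WORD,
// so full scale maps to full scale (0xFF -> 0xFFFF).
inline uint16_t ramp_doom3(uint8_t b) { return static_cast<uint16_t>((b << 8) | b); }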
Also, there exist a few code snippets on various hobby programmer sites (apparently one copied from the other, because they're identical to the letter) which do some obscure math: multiplying the linear index by a value, biasing by 128, and clamping to 65535. I'm not quite sure what this is about, but it looks like total nonsense to me, and again it is not the same as either of the above two.
What gives? It must be well-defined -- without guessing -- what the data you supply must look like. In the end, one will read the original values and let the user tweak some sliders anyway (and optionally save that blob to disk with the user's config), but still... in order to modify these values, one needs to know what they are and what's expected.
Has anyone done (and tested!) this before and knows which one is right?
While investigating the ability to change screen brightness programmatically, I came across the article Changing the screen brightness programmingly - By using the Gamma Ramp API.
Using the debugger, I took a look at the values provided by the GetDeviceGammaRamp() function. The output is a two-dimensional array defined as something like WORD GammaArray[3][256]; it is a table of 256 values used to modify the Red, Green, and Blue values of displayed pixels. The values I saw started at zero (0) for index 0 and increased by 256 for each subsequent index, so the sequence is 0, 256, 512, ..., 65024, 65280 for indices 0, 1, 2, ..., 254, 255.
My understanding is that these values are used to modify the RGB value of each pixel. By modifying the table values you can modify the display brightness. However, the effectiveness of this technique may vary depending on the display hardware.
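That read-back table is just a linear (identity) mapping: entry i equals i * 256, i.e. i << 8. As a minimal sketch of my own (no error handling), the same ramp can be rebuilt like this; loading it with SetDeviceGammaRamp() should leave the display unchanged:

#include <windows.h>

// Rebuild the linear "identity" ramp observed above: entry i == i * 256 == i << 8.
void BuildIdentityRamp(WORD ramp[3][256])
{
    for (int i = 0; i < 256; ++i)
        ramp[0][i] = ramp[1][i] = ramp[2][i] = (WORD)(i << 8);
}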
You may find this brief article, Gamma Controls, of interest; it describes gamma ramp levels, though from a Direct3D perspective. The article has this to say about gamma ramp levels:
In Direct3D, the term gamma ramp describes a set of values that map
the level of a particular color component—red, green, blue—for all
pixels in the frame buffer to new levels that are received by the DAC
for display. The remapping is performed by way of three look-up
tables, one for each color component.
Here's how it works: Direct3D takes a pixel from the frame buffer and
evaluates its individual red, green, and blue color components. Each
component is represented by a value from 0 to 65535. Direct3D takes
the original value and uses it to index a 256-element array (the
ramp), where each element contains a value that replaces the original
one. Direct3D performs this look-up and replace process for each color
component of each pixel in the frame buffer, thereby changing the
final colors for all of the on-screen pixels.
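Paraphrasing that look-up-and-replace step as code (my own illustration, not from the article; I'm assuming the high 8 bits of the 16-bit component select the entry, since a 0-65535 value has to be reduced to a 256-element index somehow):

#include <cstdint>

// The component value selects an entry in the 256-element ramp (here via its
// high byte), and that entry replaces the original component before it
// reaches the DAC.
inline uint16_t RemapComponent(uint16_t component, const uint16_t ramp[256])
{
    return ramp[component >> 8];
}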
According to the online documentation for GetDeviceGammaRamp() and SetDeviceGammaRamp(), these functions are supported in the Windows API beginning with Windows 2000 Professional.
I condensed their source down to the following example, inserted into a Windows application, to test the effect using values from the referenced article. My testing was done on Windows 7 with an AMD Radeon HD 7450 graphics adapter.
With this test, both of my displays (I have two) were affected.
// Generate the 256-entry ramp for the specified wBrightness value.
WORD GammaArray[3][256];
HDC hGammaDC = ::GetDC(NULL);
WORD wBrightness;
::GetDeviceGammaRamp (hGammaDC, GammaArray);
wBrightness = 64; // reduce the brightness
for (int ik = 0; ik < 256; ik++) {
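    // wBrightness + 128 == 256 reproduces the identity ramp (ik * 256);
    // smaller values of wBrightness scale every entry down, dimming the display.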
int iArrayValue = ik * (wBrightness + 128);
if (iArrayValue > 0xffff) iArrayValue = 0xffff;
GammaArray[0][ik] = (WORD)iArrayValue;
GammaArray[1][ik] = (WORD)iArrayValue;
GammaArray[2][ik] = (WORD)iArrayValue;
}
::SetDeviceGammaRamp (hGammaDC, GammaArray);
Sleep (3000);
wBrightness = 128; // set the brightness back to normal
for (int ik = 0; ik < 256; ik++) {
int iArrayValue = ik * (wBrightness + 128);
if (iArrayValue > 0xffff) iArrayValue = 0xffff;
GammaArray[0][ik] = (WORD)iArrayValue;
GammaArray[1][ik] = (WORD)iArrayValue;
GammaArray[2][ik] = (WORD)iArrayValue;
}
::SetDeviceGammaRamp (hGammaDC, GammaArray);
Sleep (3000);
::ReleaseDC(NULL, hGammaDC);
As an additional note, I made a slight change to the above source: instead of modifying each of the RGB channels equally, I commented out the first two assignments so that only GammaArray[2][ik] was modified. The result was a yellowish cast to the display.
I also tried putting the above source in a loop to see how the display changed, and there was quite a difference from wBrightness=0 to wBrightness=128.
for (wBrightness = 0; wBrightness <= 128; wBrightness += 16) {
for (int ik = 0; ik < 256; ik++) {
int iArrayValue = ik * (wBrightness + 128);
if (iArrayValue > 0xffff) iArrayValue = 0xffff;
GammaArray[0][ik] = (WORD)iArrayValue;
GammaArray[1][ik] = (WORD)iArrayValue;
GammaArray[2][ik] = (WORD)iArrayValue;
}
::SetDeviceGammaRamp (hGammaDC, GammaArray);
Sleep (3000);
}
Microsoft provides an online MSDN article, Using gamma correction, as part of the Direct3D documentation, which describes the basics of gamma as follows:
At the end of the graphics pipeline, just where the image leaves the
computer to make its journey along the monitor cable, there is a small
piece of hardware that can transform pixel values on the fly. This
hardware typically uses a lookup table to transform the pixels. This
hardware uses the red, green and blue values that come from the
surface to be displayed to look up gamma-corrected values in the table
and then sends the corrected values to the monitor instead of the
actual surface values. So, this lookup table is an opportunity to
replace any color with any other color. While the table has that level
of power, the typical usage is to tweak images subtly to compensate
for differences in the monitor’s response. The monitor’s response is
the function that relates the numerical value of the red, green and
blue components of a pixel with that pixel’s displayed brightness.
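Following that description, a natural way to encode an actual gamma curve (rather than the linear brightness scaling used earlier) is to raise the normalized index to the power 1/gamma. This is a sketch of my own, not code from the article; gamma = 1.0 gives an essentially linear ramp, and extreme values may be rejected by Windows, as discussed below:

#include <windows.h>
#include <cmath>

// Encode a gamma curve into the ramp. gamma == 1.0 yields a straight line
// (i * 257, close to the i * 256 table read back earlier); gamma > 1.0
// brightens the mid-tones, gamma < 1.0 darkens them.
void BuildGammaRamp(WORD ramp[3][256], double gamma)
{
    for (int i = 0; i < 256; ++i) {
        double v = 65535.0 * std::pow(i / 255.0, 1.0 / gamma);
        WORD w = (WORD)(v + 0.5);  // round to nearest
        ramp[0][i] = ramp[1][i] = ramp[2][i] = w;
    }
}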
Additionally, the software application Redshift has a page, Windows gamma adjustments, which has this to say about Microsoft Windows:
When porting Redshift to Windows I ran into trouble when setting a
color temperature lower than about 4500K. The problem is that Windows
sets limitations on what kinds of gamma adjustments can be made,
probably as a means of protecting the user against evil programs that
invert the colors, blank the display, or play some other annoying
trick with the gamma ramps. This kind of limitation is perhaps
understandable, but the problem is the complete lack of documentation
of this feature (SetDeviceGammaRamp on MSDN). A program that tries to
set a gamma ramp that is not allowed will simply fail with a generic
error leaving the programmer wondering what went wrong.
I haven't tested this, but if I had to guess, early graphics cards were non-standard in their implementation of SetDeviceGammaRamp() when Doom 3 was written, and sometimes used the LOBYTE and sometimes the HIBYTE of the WORD value. The consensus moved to using only the HIBYTE, hence the word_value = byte_value<<8.
Here's another data point, from the PsychoPy library (in Python), which simply swaps LOBYTE and HIBYTE:
"""Sets the hardware look-up table, using platform-specific ctypes functions.
For use with pyglet windows only (pygame has its own routines for this).
Ramp should be provided as 3x256 or 3x1024 array in range 0:1.0
"""
if sys.platform=='win32':
newRamp= (255*newRamp).astype(numpy.uint16)
newRamp.byteswap(True)#necessary, according to pyglet post from Martin Spacek
success = windll.gdi32.SetDeviceGammaRamp(pygletWindow._dc, newRamp.ctypes)
if not success: raise AssertionError, 'SetDeviceGammaRamp failed'
It also appears that Windows doesn't allow all gamma settings; see:
http://jonls.dk/2010/09/windows-gamma-adjustments/
Update:
The first Windows APIs to offer gamma control are Windows Graphics Device Interface (GDI)’s SetDeviceGammaRamp and GetDeviceGammaRamp. These APIs work with three 256-entry arrays of WORDs, with each WORD encoding zero up to one, represented by WORD values 0 and 65535. The extra precision of a WORD typically isn’t available in actual hardware lookup tables, but these APIs were intended to be flexible. These APIs, in contrast to the others described later in this section, allow only a small deviation from an identity function. In fact, any entry in the ramp must be within 32768 of the identity value. This restriction means that no app can turn the display completely black or to some other unreadable color.
http://msdn.microsoft.com/en-us/library/windows/desktop/jj635732(v=vs.85).aspx
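Based on the restriction quoted above, a rough pre-flight check could look like the following. This is a sketch of my own: I'm assuming the "identity value" for index i is i * 256 (matching the table read back earlier), and the actual, undocumented validation inside Windows may well be stricter.

#include <windows.h>
#include <cstdlib>

// Returns false if any ramp entry deviates from the identity value by more
// than 32768, the limit described for SetDeviceGammaRamp above.
bool RampLooksAcceptable(const WORD ramp[3][256])
{
    for (int c = 0; c < 3; ++c)
        for (int i = 0; i < 256; ++i)
            if (std::abs((int)ramp[c][i] - i * 256) > 32768)
                return false;
    return true;
}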