I'd like to add the ability to adjust screen gamma at application startup and reset it at exit. While it's debatable whether one should tamper with gamma at all (personally, I find it useless and even detrimental), some people expect to be able to do that kind of thing.
It's just one simple API call too, so it's all easy, right?
MSDN says: "The gamma ramp is specified in three arrays of 256 WORD elements each [...] values must be stored in the most significant bits of each WORD to increase DAC independence." In my understanding, this means something like word_value = byte_value << 8, which sounds rather weird, but that's how I read it.
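Taken literally, that reading would build an identity ramp like this (a sketch under my interpretation, untested):

```c
#include <windows.h>

/* MSDN reading: put the 8-bit value in the most significant byte. */
void build_ramp_msdn(WORD ramp[3][256])
{
    for (int i = 0; i < 256; i++) {
        WORD w = (WORD)(i << 8);  /* word_value = byte_value << 8 */
        ramp[0][i] = ramp[1][i] = ramp[2][i] = w;
    }
}
```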
The Doom3 source code contains a function that takes three arrays of char values and converts them into an array of uint16_t values where each element has the same byte in both its upper and lower half. In other words, something like word_value = (byte_value << 8) | byte_value. This is equally weird, but what's worse, it is not the same as the above.
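In code, my paraphrase of the Doom3 approach looks like this (a sketch of the idea, not the actual Doom3 function):

```c
#include <windows.h>

/* Doom3-style: replicate the byte into both halves of the WORD,
   so 0xFF maps to 0xFFFF rather than 0xFF00. */
void build_ramp_doom3(const unsigned char r[256],
                      const unsigned char g[256],
                      const unsigned char b[256],
                      WORD ramp[3][256])
{
    for (int i = 0; i < 256; i++) {
        ramp[0][i] = (WORD)((r[i] << 8) | r[i]);
        ramp[1][i] = (WORD)((g[i] << 8) | g[i]);
        ramp[2][i] = (WORD)((b[i] << 8) | b[i]);
    }
}
```

The practical difference from the MSDN reading is only in the low byte: full white ends up as 0xFFFF instead of 0xFF00.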
Also, there exist a few code snippets on the internet on various hobby programmer sites (apparently one stolen from the other, because they're identical to the letter) which do some obscure math: multiplying the linear index by a value, biasing with 128, and clamping to 65535. I'm not quite sure what this is about, but it looks like total nonsense to me, and again it is not the same as either of the above two.
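As far as I can reconstruct it, those snippets amount to something like this (wBrightness and the 128 bias are taken from those snippets; a sketch, not verified):

```c
#include <windows.h>

/* The math seen in those snippets: scale the index, bias with 128,
   clamp to 65535. */
void build_ramp_snippet(WORD ramp[3][256], WORD wBrightness)
{
    for (int i = 0; i < 256; i++) {
        int v = i * (wBrightness + 128);
        if (v > 65535)
            v = 65535;
        ramp[0][i] = ramp[1][i] = ramp[2][i] = (WORD)v;
    }
}
```

Note that with wBrightness = 128 this reduces to i * 256, i.e. the same as byte_value << 8; for other values it differs from both of the above.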
What gives? It must be well-defined -- without guessing -- what the data you supply should look like. In the end, what one will do is read the original values and let the user tweak some sliders anyway (and optionally save that blob to disk with the user's config), but still... in order to modify these values, one needs to know what they are and what's expected.
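For the save/restore part, at least, the shape of the code seems clear no matter which encoding is correct -- a minimal sketch, assuming the primary display's DC is the right one to use:

```c
#include <windows.h>

static WORD g_savedRamp[3][256];

/* Call at startup: remember whatever ramp the driver currently has. */
BOOL save_gamma(void)
{
    HDC hdc = GetDC(NULL);
    BOOL ok = GetDeviceGammaRamp(hdc, g_savedRamp);
    ReleaseDC(NULL, hdc);
    return ok;
}

/* Call at exit: put the original ramp back. */
BOOL restore_gamma(void)
{
    HDC hdc = GetDC(NULL);
    BOOL ok = SetDeviceGammaRamp(hdc, g_savedRamp);
    ReleaseDC(NULL, hdc);
    return ok;
}
```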
Has anyone done (and tested!) this before and knows which one is right?
While investigating the ability to change screen brightness programmatically, I came across the article "Changing the screen brightness programmingly - By using the Gama Ramp API".
Using the debugger, I took a look at the values provided by the GetDeviceGammaRamp() function. The output is a two-dimensional array defined as something like WORD GammaArray[3][256]; -- a table of 256 values to modify the Red, Green, and Blue values of displayed pixels. The values I saw started at zero (0) for index 0 and increased by 256 for each subsequent index, so the sequence is 0, 256, 512, ..., 65024, 65280 for indices 0, 1, 2, ..., 254, 255.

My understanding is that these values are used to modify the RGB value of each pixel. By modifying the table values you can modify the display brightness. However, the effectiveness of this technique may vary depending on the display hardware.
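That sequence can be checked with a quick dump of the ramp (a sketch; the actual values depend on your driver and settings):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    WORD gammaArray[3][256];
    HDC hdc = GetDC(NULL);
    if (GetDeviceGammaRamp(hdc, gammaArray)) {
        /* Expect 0, 256, 512, ... per the observation above. */
        for (int i = 0; i < 256; i += 64)
            printf("[%3d] R=%5u G=%5u B=%5u\n", i,
                   (unsigned)gammaArray[0][i],
                   (unsigned)gammaArray[1][i],
                   (unsigned)gammaArray[2][i]);
    }
    ReleaseDC(NULL, hdc);
    return 0;
}
```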
You may also find the brief article Gamma Controls of interest; it describes gamma ramp levels from a Direct3D perspective.
According to the online documentation for GetDeviceGammaRamp() and SetDeviceGammaRamp(), these functions are supported in the Windows API beginning with Windows 2000 Professional.

I used their source, condensed down to the following example and inserted into a Windows application, to test the effect using values from the referenced article. My testing was done with Windows 7 and an AMD Radeon HD 7450 graphics adapter.
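A sketch of that condensed test (my reconstruction along the article's lines, not its exact code; wBrightness is the tweakable value, with 128 roughly neutral):

```c
#include <windows.h>

/* Condensed brightness test in the style of the Gamma Ramp API article.
   wBrightness: 0 = dark, ~128 = roughly neutral. */
void set_brightness(WORD wBrightness)
{
    WORD GammaArray[3][256];
    HDC hGammaDC = GetDC(NULL);

    /* Start from the current ramp so unmodified channels keep their values. */
    GetDeviceGammaRamp(hGammaDC, GammaArray);

    for (int ik = 0; ik < 256; ik++) {
        int iValue = ik * (wBrightness + 128);
        if (iValue > 65535)
            iValue = 65535;
        GammaArray[0][ik] = (WORD)iValue;   /* red   */
        GammaArray[1][ik] = (WORD)iValue;   /* green */
        GammaArray[2][ik] = (WORD)iValue;   /* blue  */
    }

    SetDeviceGammaRamp(hGammaDC, GammaArray);
    ReleaseDC(NULL, hGammaDC);
}
```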
With this test, both of my displays (I have two) were affected.
As an additional note, I made a slight change to the above source: instead of modifying each of the RGB values equally, I commented out the first two assignments so that only GammaArray[2][ik] was modified. The result was a yellowish cast to the display.

I also tried putting the above source in a loop to check how the display changed, and there was quite a difference from wBrightness=0 to wBrightness=128.
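Such a loop could look like this (a sketch, reusing set_brightness() from the example above; the Sleep() just makes the transition visible):

```c
#include <windows.h>

/* Sweep brightness upward to watch the effect on screen. */
void sweep_brightness(void)
{
    for (WORD w = 0; w <= 128; w += 8) {
        set_brightness(w);
        Sleep(100);
    }
}
```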
Microsoft provides an online MSDN article, Using gamma correction, as part of the Direct3D documentation, which describes the basics of gamma correction.
Additionally, the software application Redshift has a page, Windows gamma adjustments, discussing the restrictions Microsoft Windows places on gamma settings.
I haven't tested this, but if I had to guess: early graphics cards were non-standard in their implementation of SetDeviceGammaRamp() when Doom 3 was written, and sometimes used the LOBYTE and sometimes the HIBYTE of the WORD value. The consensus moved to only using the HIBYTE, hence the word_value = byte_value << 8.

Here's another data point, from the PsychoPy library (in Python), which simply swaps LOBYTE and HIBYTE.
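Expressed in C, that swap would look something like this (my illustration of the idea, not PsychoPy's actual code):

```c
#include <windows.h>

/* Swap the low and high bytes of every ramp entry -- what the
   PsychoPy workaround amounts to. */
void swap_ramp_bytes(WORD ramp[3][256])
{
    for (int c = 0; c < 3; c++)
        for (int i = 0; i < 256; i++)
            ramp[c][i] = (WORD)((LOBYTE(ramp[c][i]) << 8) | HIBYTE(ramp[c][i]));
}
```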
It also appears that Windows doesn't allow all gamma settings; see http://jonls.dk/2010/09/windows-gamma-adjustments/
Update: http://msdn.microsoft.com/en-us/library/windows/desktop/jj635732(v=vs.85).aspx