Suppose 5 samples of hue are taken using a simple HSV color model, with values 355, 5, 5, 5, 5. All of these are reds, and perceptually they sit right next to each other. But the simple average is 75, which is far from either 0 or 360 and corresponds to a yellow-green.
What is a better way to calculate this mean and associated std?
The simple solution is to convert each angle into a unit vector, i.e., from polar coordinates into Cartesian coordinates. Since you are working with colors, think of this as a conversion into the (a*,b*) plane. Then take the mean of those coordinates and convert back into polar form. In MATLAB:
theta = [355,5,5,5,5];
x = cosd(theta); % cosine in terms of degrees
y = sind(theta); % sine with a degree argument
Now take the mean of x and y, compute the angle from those means, and convert back from radians to degrees.
meanangle = atan2(mean(y),mean(x))*180/pi
meanangle =
3.0049
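If this comes up repeatedly, the vector-averaging step can be packaged into a small helper function. This is only a sketch, and the name circmeand is my own, not a built-in:

function mu = circmeand(theta)
% Circular mean of angles given in degrees: average the unit vectors
% (cosd, sind), then take the angle of the resulting mean vector.
mu = atan2(mean(sind(theta)), mean(cosd(theta)))*180/pi;
end

Calling it on the same data reproduces the result above:

circmeand([355 5 5 5 5])
ans =
3.0049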
Of course, this approach gives only the mean angle. As a check, it is consistent with (though not exactly equal to) the ordinary mean of the angles taken directly, once I recognize that 355 degrees really wraps to -5 degrees.
mean([-5 5 5 5 5])
ans =
3
To compute the standard deviation, the simplest approach is
std([-5 5 5 5 5])
ans =
4.4721
Yes, that requires me to do the wrap explicitly.
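If you would rather avoid the explicit wrap, there is an alternative from circular statistics (not what I did above, just a sketch of the standard definition): compute the mean resultant length R of the unit vectors and take sqrt(-2*log(R)) as the circular standard deviation.

theta = [355 5 5 5 5];
R = sqrt(mean(cosd(theta))^2 + mean(sind(theta))^2); % mean resultant length, between 0 and 1
circstd = sqrt(-2*log(R))*180/pi % circular standard deviation, converted to degrees

For these data that comes out to roughly 4 degrees. Note that this matches the population (divide-by-n) standard deviation of the wrapped values rather than the sample value of 4.4721 shown above; for tightly clustered angles the two definitions agree closely.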