I'm writing a web app that takes a user-submitted image, gets the pixel data via a canvas element, does some processing, and then renders the image as vector shapes (using Protovis). It's working well, but I end up with several thousand colors, and I'd like to let the user pick a target palette size and reduce the color palette to that size.
At the point where I want to reduce the color space, I'm working with an array of RGB pixel data, like this:
[[190,197,190], [202,204,200], [207,214,210], [211,214,211], [205,207,207], ...]
I tried the naive option of just removing the least-significant bits from each color channel, but the results were pretty bad. I've done some research on color quantization algorithms, but have yet to find a clear description of how to implement one. I could probably work out a kludgy way to send the data to the server, run it through an image processing program, and send the resulting palette back, but I'd prefer to do it in JavaScript on the client side.
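For reference, the naive version was essentially just masking off the low bits of each channel, something along these lines:

    function truncateBits(pixels, bitsToKeep) {
        // Keep only the top bitsToKeep bits of each channel,
        // e.g. bitsToKeep = 4 gives a mask of 0xf0 and 16 levels per channel.
        var mask = 0xff ^ (0xff >> bitsToKeep);
        return pixels.map(function (pixel) {
            return pixel.map(function (value) { return value & mask; });
        });
    }

The problem is that this snaps every image to the same fixed grid of colors, so it ignores which colors actually appear in this particular image.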
Does anyone have a clearly explained algorithm (or example implementation) that would work here? The goal is to reduce a palette of several thousand colors to a smaller palette optimized for this specific image.
Edit (7/25/11): I took @Pointy's suggestion and implemented (most of) Leptonica's MMCQ (modified median cut quantization) in JavaScript. If you're interested, you can see the code here.
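For anyone who just wants the gist rather than the full MMCQ code, plain median cut boils down to something like this (a simplified sketch, not the implementation linked above):

    // Median cut: repeatedly split the box of pixels whose widest channel
    // range is largest, at the median of that channel, until there are
    // paletteSize boxes, then average each box to get one palette color.
    function medianCut(pixels, paletteSize) {
        var boxes = [pixels.slice()];
        while (boxes.length < paletteSize) {
            // Find the box and channel with the largest range.
            var boxIndex = -1, channel = 0, bestRange = -1;
            for (var i = 0; i < boxes.length; i++) {
                for (var c = 0; c < 3; c++) {
                    var min = 255, max = 0;
                    for (var j = 0; j < boxes[i].length; j++) {
                        if (boxes[i][j][c] < min) min = boxes[i][j][c];
                        if (boxes[i][j][c] > max) max = boxes[i][j][c];
                    }
                    if (max - min > bestRange) {
                        bestRange = max - min;
                        boxIndex = i;
                        channel = c;
                    }
                }
            }
            if (bestRange <= 0) break; // every remaining box is a single color
            // Split that box at the median of its widest channel.
            var box = boxes[boxIndex];
            box.sort(function (a, b) { return a[channel] - b[channel]; });
            var mid = Math.floor(box.length / 2);
            boxes.splice(boxIndex, 1, box.slice(0, mid), box.slice(mid));
        }
        // Average each box to produce the final palette.
        return boxes.map(function (box) {
            var sum = [0, 0, 0];
            for (var j = 0; j < box.length; j++) {
                for (var c = 0; c < 3; c++) sum[c] += box[j][c];
            }
            return sum.map(function (s) { return Math.round(s / box.length); });
        });
    }

Calling medianCut(pixelArray, 16) returns sixteen [r, g, b] palette colors; the MMCQ variant adds refinements on top of this basic split rule.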
Edit (8/5/11): The clusterfck library looks like another great option for this (though I think it's a bit slower than my implementation).
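For comparison, the clustering approach amounts to running k-means over the RGB values. A bare-bones sketch (generic k-means, not clusterfck's actual API) looks roughly like this:

    // Seed k centers, assign each pixel to its nearest center, move each
    // center to the mean of its cluster, and repeat a fixed number of passes.
    function kmeansPalette(pixels, k, passes) {
        var centers = [], i;
        for (i = 0; i < k; i++) {
            centers.push(pixels[Math.floor(i * pixels.length / k)].slice());
        }
        for (var pass = 0; pass < (passes || 10); pass++) {
            var sums = [];
            for (i = 0; i < k; i++) sums.push([0, 0, 0, 0]); // r, g, b, count
            for (var p = 0; p < pixels.length; p++) {
                var px = pixels[p], best = 0, bestDist = Infinity;
                for (i = 0; i < k; i++) {
                    var dr = px[0] - centers[i][0],
                        dg = px[1] - centers[i][1],
                        db = px[2] - centers[i][2],
                        d = dr * dr + dg * dg + db * db;
                    if (d < bestDist) { bestDist = d; best = i; }
                }
                sums[best][0] += px[0]; sums[best][1] += px[1];
                sums[best][2] += px[2]; sums[best][3]++;
            }
            for (i = 0; i < k; i++) {
                if (sums[i][3] > 0) {
                    centers[i] = [sums[i][0] / sums[i][3],
                                  sums[i][1] / sums[i][3],
                                  sums[i][2] / sums[i][3]];
                }
            }
        }
        return centers.map(function (c) {
            return c.map(function (v) { return Math.round(v); });
        });
    }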