I need to scale the resolution of an image coming from a view layer in an iPhone application. The obvious way is to specify a scale factor in UIGraphicsBeginImageContextWithOptions, but any time the scale factor is not 1.0, the quality of the image goes to pot -- far more than would be expected from the loss of pixels.
I've tried several other scaling techniques, but they all seem to revolve around CGContext stuff and all appear to do the same thing.
Simply changing image "size" (without changing the dot resolution) isn't sufficient, mostly because that info seems to be discarded very quickly by other hands in the pipeline (the image will be converted to a JPG and emailed).
Is there any other way to scale an image on iPhone?
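For concreteness, the "obvious way" I mean looks roughly like this (a sketch, not my exact code):

```swift
import UIKit

// Capture a view's layer into a UIImage at a given scale factor.
// Passing a scale other than 1.0 here is what degrades the output.
func snapshot(of view: UIView, scale: CGFloat) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, scale)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    view.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```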
Swift extension:
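A minimal sketch of such an extension, assuming a UIGraphicsImageRenderer-based redraw (the method name resized(to:isOpaque:) is illustrative, not from the original answer):

```swift
import UIKit

extension UIImage {
    // Redraws the image at the target point size.
    // isOpaque maps onto the renderer format's opaque flag (see note below).
    func resized(to targetSize: CGSize, isOpaque: Bool = false) -> UIImage {
        let format = imageRendererFormat
        format.opaque = isOpaque
        return UIGraphicsImageRenderer(size: targetSize, format: format).image { _ in
            draw(in: CGRect(origin: .zero, size: targetSize))
        }
    }
}
```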
Example:
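Hypothetical usage, continuing the sketch above -- scale an image to 100x100 points, then encode it as JPEG for emailing (the asset name is made up):

```swift
let original = UIImage(named: "snapshot")!
let thumbnail = original.resized(to: CGSize(width: 100, height: 100), isOpaque: true)
let jpegData = thumbnail.jpegData(compressionQuality: 0.8)
```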
Set isOpaque to true if the image has no alpha: drawing will have better performance.
I came up with this algorithm to create a half-size image:
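A sketch of one way to implement that 2x2 box average, assuming an RGBA8888 bitmap; the function name halfSizeImage is illustrative, and odd edge rows/columns are simply dropped:

```swift
import UIKit

// Averages each 2x2 block of source pixels into one destination pixel.
func halfSizeImage(from image: UIImage) -> UIImage? {
    guard let source = image.cgImage else { return nil }
    let dstWidth = source.width / 2
    let dstHeight = source.height / 2
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue

    // Render the source into a bitmap context with a known RGBA8888 layout,
    // and allocate a matching destination context.
    guard let srcCtx = CGContext(data: nil, width: source.width, height: source.height,
                                 bitsPerComponent: 8, bytesPerRow: 0,
                                 space: colorSpace, bitmapInfo: bitmapInfo),
          let dstCtx = CGContext(data: nil, width: dstWidth, height: dstHeight,
                                 bitsPerComponent: 8, bytesPerRow: 0,
                                 space: colorSpace, bitmapInfo: bitmapInfo)
    else { return nil }
    srcCtx.draw(source, in: CGRect(x: 0, y: 0, width: source.width, height: source.height))

    guard let srcBase = srcCtx.data?.assumingMemoryBound(to: UInt8.self),
          let dstBase = dstCtx.data?.assumingMemoryBound(to: UInt8.self)
    else { return nil }
    let srcRow = srcCtx.bytesPerRow
    let dstRow = dstCtx.bytesPerRow

    for y in 0..<dstHeight {
        for x in 0..<dstWidth {
            for c in 0..<4 {
                // Average the four source pixels covering this destination pixel.
                let a = Int(srcBase[(2 * y) * srcRow + (2 * x) * 4 + c])
                let b = Int(srcBase[(2 * y) * srcRow + (2 * x + 1) * 4 + c])
                let d = Int(srcBase[(2 * y + 1) * srcRow + (2 * x) * 4 + c])
                let e = Int(srcBase[(2 * y + 1) * srcRow + (2 * x + 1) * 4 + c])
                dstBase[y * dstRow + x * 4 + c] = UInt8((a + b + d + e) / 4)
            }
        }
    }

    guard let result = dstCtx.makeImage() else { return nil }
    return UIImage(cgImage: result, scale: image.scale, orientation: image.imageOrientation)
}
```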
I tried just taking every other pixel of every other row, instead of averaging, but it resulted in an image about as bad as the default algorithm.
Regarding the UIImage resize problem, this post gives many ways to handle a UIImage object. UIImage also has some orientation problems that need to be fixed; this post and another one address that.
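The linked posts aren't quoted here, but a common orientation fix is to redraw the image so its pixel data matches the .up orientation; a sketch, with an assumed helper name:

```swift
import UIKit

extension UIImage {
    // Hypothetical helper: bakes the EXIF orientation into the pixel data
    // by redrawing, so later CGContext work sees an upright image.
    func withNormalizedOrientation() -> UIImage {
        guard imageOrientation != .up else { return self }
        return UIGraphicsImageRenderer(size: size).image { _ in
            draw(in: CGRect(origin: .zero, size: size))
        }
    }
}
```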
I suppose you could use something like ImageMagick. Apparently it's been successfully ported to the iPhone: http://www.imagemagick.org/discourse-server/viewtopic.php?t=14089
I've always been happy with the quality of images scaled by this library, so I think you'll be satisfied with the results.