I'm currently working on a script that shows thumbnails from a website in a mobile app, and I'm having problems with the "retina display". If the original image is big enough, I show a thumbnail at double the required size; if it is not, I show it at the required size. My function checks whether the image can be resized proportionally, and if it can't, I resize it to the "min-width" or "min-height" and crop it from the center.
Here is the problem: when the function detects that the image can't be shown at double size, I want to "scale up" the cropped size until it reaches the limits of the original image, and I can't find a way to calculate that scale correctly. (My main problem is that I'm not really good at maths :P)
To make this easier to follow:
- This is the original size of the image: 600 x 301
- This is the required/cropped size: 320 x 180
- This is the size I want to get: 535 x 301. I got it from Photoshop; it is the result of scaling 320 x 180 up until it hits the limits of the original size.
PS: I know GD, so I only need the formula to calculate the size.
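If it helps, the formula boils down to taking the smaller of the two width/height ratios and applying it to the target rectangle. A quick sketch of the calculation in PHP (variable names are mine, just for illustration):

```php
<?php
// Largest rectangle with the target aspect ratio that fits in the original.
$origW = 600;   $origH = 301;    // original image size
$targetW = 320; $targetH = 180;  // required/cropped size

// Use the smaller ratio so neither dimension exceeds the original.
$scale = min($origW / $targetW, $origH / $targetH); // min(1.875, 1.6722...) = 1.6722...

$cropW = (int) floor($targetW * $scale); // 535
$cropH = (int) floor($targetH * $scale); // 301

echo "{$cropW} x {$cropH}\n"; // 535 x 301 -- matches the Photoshop result
```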
The same algorithm that scales down can also scale up. This is untested code, and it is only designed to scale down, but if you remove the "scale down" tests it might do the trick for you.
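Since the code itself isn't shown here, here is a rough sketch (mine, not the original answer's code) of that idea: a typical "fit inside" routine clamps the scale factor at 1 so it never enlarges, and removing that clamp lets the same min-ratio math scale in both directions:

```php
<?php
// Sketch only: the same min-ratio calculation can scale down or up.
function fitScale(int $origW, int $origH, int $targetW, int $targetH,
                  bool $allowUpscale = false): float
{
    $scale = min($origW / $targetW, $origH / $targetH);

    // This is the kind of "scale down" test the answer refers to;
    // dropping it (or passing $allowUpscale = true) permits enlarging.
    if (!$allowUpscale && $scale > 1.0) {
        $scale = 1.0;
    }
    return $scale;
}
```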
(Posted on behalf of the OP):
Thanks to Ray Paseur, I ended up with this function:
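The OP's actual function isn't reproduced here, but a minimal, hypothetical GD version built on the min-ratio formula above might look like this (function name and signature are my own assumptions; the GdImage type requires PHP 8):

```php
<?php
// Hypothetical sketch: crop the largest centered region of the source that
// matches the target aspect ratio, then resize it to $targetW x $targetH.
function cropResizeCenter(string $srcPath, int $targetW, int $targetH): GdImage
{
    $src = imagecreatefromstring(file_get_contents($srcPath));
    if ($src === false) {
        throw new RuntimeException("Could not read image: {$srcPath}");
    }
    $origW = imagesx($src);
    $origH = imagesy($src);

    // Largest rectangle with the target aspect ratio that fits the original.
    $scale = min($origW / $targetW, $origH / $targetH);
    $cropW = (int) floor($targetW * $scale);
    $cropH = (int) floor($targetH * $scale);

    // Center the crop region within the original image.
    $srcX = (int) floor(($origW - $cropW) / 2);
    $srcY = (int) floor(($origH - $cropH) / 2);

    $dst = imagecreatetruecolor($targetW, $targetH);
    imagecopyresampled($dst, $src, 0, 0, $srcX, $srcY,
                       $targetW, $targetH, $cropW, $cropH);
    imagedestroy($src);
    return $dst;
}
```

Usage would be along the lines of `imagejpeg(cropResizeCenter('photo.jpg', 320, 180), 'thumb.jpg');`.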