Swift - How to save memory using AlamofireImage/ImageShack

Posted 2020-02-16 01:45

Question:

I'm using Alamofire and AlamofireImage to cache images, with ImageShack for storage and a Firebase backend. Sometimes memory usage goes over 100 MB. I want to learn how to improve my image caching and how to scale images properly before uploading. For example, I have declared one shared cache for the whole project using AutoPurgingImageCache:

let photoCache = AutoPurgingImageCache(
        memoryCapacity: 100 * 1024 * 1024,
        preferredMemoryUsageAfterPurge: 60 * 1024 * 1024
    )

Is there a better way to set memoryCapacity and preferredMemoryUsageAfterPurge to save memory? What exactly do they mean?

I'm using the Alamofire.upload call below in didFinishPickingImage:

func imagePickerController(picker: UIImagePickerController, didFinishPickingImage image: UIImage, editingInfo: [String : AnyObject]?) {

        let urlStr = "http://post.imageshack.us/upload_api.php"
        let url = NSURL(string: urlStr)!
        let imgData = UIImageJPEGRepresentation(image, 0.25)!

        let keyData = "____IMGSHACK_API_KEY_____".dataUsingEncoding(NSUTF8StringEncoding)!
        let keyJSON = "json".dataUsingEncoding(NSUTF8StringEncoding)!
        let publicData = "no".dataUsingEncoding(NSUTF8StringEncoding)!

        Alamofire.upload(.POST, url, headers: nil,
                         multipartFormData: { multipartFormData in
                            multipartFormData.appendBodyPart(data: imgData, name: "fileupload", fileName: "image", mimeType: "image/jpeg")
                            multipartFormData.appendBodyPart(data: keyData, name: "key")
                            multipartFormData.appendBodyPart(data: keyJSON, name: "format")
                            multipartFormData.appendBodyPart(data: publicData, name: "public")
            }, encodingCompletion: { encodingResult in
                switch encodingResult {
                case .Success(let upload, _, _):
                    upload.validate()
                    upload.responseJSON { response in
                        // Avoid force-unwrapping responseJSON; a failed or
                        // malformed response would otherwise crash here.
                        if let responseJSON = response.result.value as? [String: AnyObject],
                            links = responseJSON["links"] as? [String: AnyObject],
                            imgLink = links["image_link"] as? String {
                                self.postToFirebase(imgLink)
                                print(imgLink)
                        }
                    }
                case .Failure(let encodingError):
                    print(encodingError)
                }
        })

        self.dismissViewControllerAnimated(true, completion: nil)
        self.tableView.contentOffset = CGPointMake(0, -60)

}

I don't think UIImageJPEGRepresentation(image, 0.25)! compresses well enough on its own, since my images come from the internet at full size. What if I scale before compressing for upload, like this:

let size = CGSize(width: 500.0, height: 273.0)
let scaledImage = image.af_imageScaledToSize(size)
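
That call stretches the image to an exact 500×273, though. A variant that keeps the aspect ratio could look like this (a sketch using AlamofireImage's aspect-fit helper; the 1024-point bound and the 0.7 JPEG quality are assumptions to tune):

let maxSize = CGSize(width: 1024.0, height: 1024.0)             // assumed upper bound
let uploadImage = image.af_imageAspectScaledToFitSize(maxSize)  // fits within maxSize, keeps aspect ratio
let uploadData = UIImageJPEGRepresentation(uploadImage, 0.7)    // 0.7 quality is a guess to tune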

But then I worry the image quality would be bad on the web platform. So instead I scale and cache after the image is downloaded:

    //MARK: - Image Downloading

    func getNetworkImage(urlString: String, comesFrom: String, completion: (UIImage -> Void)) -> (Request) {

        return Alamofire.request(.GET, urlString).responseImage { (response) -> Void in
            guard let image = response.result.value else { return }
            completion(image)

            if comesFrom == "collectionPage" {
                self.cacheCollectionImage(image, urlString: urlString)
            }
            else if comesFrom == "postPage" {
                self.cachePostImage(image, urlString: urlString)
            }

        }
    }
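
At the call site I check the cache first, roughly like this (a sketch; cell and postImageView are placeholder names, not my real outlets):

    // Hypothetical call site: try the cache before hitting the network,
    // so scrolling back does not re-download and re-scale the image.
    if let image = cachedImage(urlString) {
        cell.postImageView.image = image
    } else {
        getNetworkImage(urlString, comesFrom: "postPage") { image in
            cell.postImageView.image = image
        }
    }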

    //MARK: - Image Caching

    func cacheCollectionImage(image: Image, urlString: String) {

        let size = CGSize(width: 188.0, height: 120.0)
        let scaledImage = image.af_imageScaledToSize(size)

        photoCache.addImage(scaledImage, withIdentifier: urlString)

    }

    func cachePostImage(image: Image, urlString: String) {

        let size = CGSize(width: 300.0, height: 165.0)
        let scaledImage = image.af_imageScaledToSize(size)

        photoCache.addImage(scaledImage, withIdentifier: urlString)

    }


    func cachedImage(urlString: String) -> Image? {
        print("cachedImage() func ended")
        return photoCache.imageWithIdentifier(urlString)
    }

Also, my application launches at around 40 MB of memory usage, and then I see a peak of about 130 MB. After uploads, when I scroll back, AutoPurgingImageCache does its job and memory usage decreases, but I think it is still too much.

What do you advise for saving memory under these conditions? I'd appreciate some explanation of the tools or strategies to handle this while keeping quality as good as possible on every platform.

Answer 1:

You have just copy-and-pasted the code from AlamofireImage:

let photoCache = AutoPurgingImageCache(
    memoryCapacity: 100 * 1024 * 1024,
    preferredMemoryUsageAfterPurge: 60 * 1024 * 1024
)

100 * 1024 * 1024 bytes will create a 100 MB in-memory cache. Once that limit is reached, the cache is purged down to 60 MB. If you do not want such a big cache, just reduce those numbers. There is no "correct" value here; it depends solely on your use case. Personally, I use a 20 MB in-memory cache and a 100 MB disk cache.
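
For example, a smaller configuration along those lines could look like this (20 MB is the in-memory figure I use; the 15 MB purge target is just an illustration to adjust):

let photoCache = AutoPurgingImageCache(
    memoryCapacity: 20 * 1024 * 1024,                  // 20 MB in-memory cap
    preferredMemoryUsageAfterPurge: 15 * 1024 * 1024   // purge back down to 15 MB
)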