Terminated due to memory issue when saving altered image

Posted 2019-09-12 09:06

Question:

I am building an app that allows users to add a timestamp to a photo. When a photo from the camera roll is selected, my app sometimes crashes and Xcode displays the error: "Message from debugger: Terminated due to memory issue". I found it crashes if I try to add a timestamp to a very large image, or if I repeatedly add a timestamp to the same image.

Below is the code I believe is relevant to the issue. Also, here is a link to the entire project.

This is how the timestamp gets added to the image. I put comments where it crashes; it does not always crash on the same line of code. I can't figure out why my code is causing the memory issue.

  func textToImage(drawText text: NSString, inImage image: UIImage, atPoint point: CGPoint) -> UIImage {

    let color = UserDefaults.standard.colorForKey(key: "color")!
    let font = UserDefaults.standard.value(forKey: "font") as! String
    let size = UserDefaults.standard.value(forKey: "size") as! Int
    let fontAndSize = UIFont(name: font, size: CGFloat(size))!
    let location = Locations(rawValue: UserDefaults.standard.value(forKey: "location") as! String)!
    let scale = UIScreen.main.scale


    UIGraphicsBeginImageContextWithOptions(image.size, false, scale)
    // sometimes terminates here

    let textFontAttributes = [
      NSFontAttributeName: fontAndSize,
      NSForegroundColorAttributeName: color
      ] as [String : Any]
    image.draw(in: CGRect(origin: CGPoint.zero, size: image.size))

    let rectSize = text.boundingRect(with: CGSize(width:CGFloat(MAXFLOAT), height: CGFloat(MAXFLOAT)), options: NSStringDrawingOptions.usesLineFragmentOrigin, attributes: textFontAttributes, context: nil).size
    // sometimes terminates here
    if location == .topRight || location == .bottomRight || location == .center || location == .topCenter || location == .bottomCenter {
      // Calculate the text size

      if location == .center || location == .topCenter || location == .bottomCenter {
        // Subtract half the text width from the x origin
        let rect = CGRect(x:point.x-(rectSize.width / 2), y:point.y-(rectSize.height / 2), width:rectSize.width, height: rectSize.height)
        text.draw(in: rect, withAttributes: textFontAttributes)
      } else {
        // Subtract the text width from the x origin
        let rect = CGRect(x:point.x-rectSize.width, y:point.y-(rectSize.height / 2), width:rectSize.width, height: rectSize.height)
        text.draw(in: rect, withAttributes: textFontAttributes)
      }
    } else {
      let rect = CGRect(x:point.x, y:point.y-(rectSize.height / 2), width:rectSize.width, height: rectSize.height)
      text.draw(in: rect, withAttributes: textFontAttributes)
    }

    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return newImage!
  }

  func selectPhotoFromCameraRoll(mediaType: String) {
    imagePicker.sourceType = .photoLibrary
    if mediaType == "Video" {
      imagePicker.mediaTypes = [kUTTypeMovie as String]
    }
    newMedia = false
    present(imagePicker, animated: true, completion: nil)
  }

  func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if let mediaType = info[UIImagePickerControllerMediaType] as? String {
      if mediaType.isEqual((kUTTypeImage as String)) {
        if let pickedImage = info[UIImagePickerControllerOriginalImage] as? UIImage {
          let width = pickedImage.size.width
          let height = pickedImage.size.height
          let location = Locations(rawValue: UserDefaults.standard.value(forKey: "location") as! String)!
          var point = CGPoint(x: 0, y: 0)
          switch location {
          case .topLeft:
            point = CGPoint(x: 30, y: 50)
          case .topRight:
            point = CGPoint(x: width - 30, y: 50)
          case .bottomLeft:
            point = CGPoint(x: 30, y: height - 50)
          case .bottomRight:
            point = CGPoint(x: width - 30, y: height - 50)
          case .center:
            point = CGPoint(x: width / 2, y: height / 2)
          case .topCenter:
            point = CGPoint(x: width / 2, y: 50)
          case .bottomCenter:
            point = CGPoint(x: width / 2, y: height - 50)
          }
          let savedFormat = UserDefaults.standard.value(forKey: "format") as! String
          var date = Date()
          if !currentDateBool {
            date = UserDefaults.standard.value(forKey: "selectedDate") as! Date
          }
          let timestampText = getFormattedDateFromFormatType(formatType: savedFormat, date: date) as NSString
          let timestampImage = textToImage(drawText: timestampText, inImage: pickedImage, atPoint: point)
          if newMedia {
            UIImageWriteToSavedPhotosAlbum(timestampImage, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
          } else {
            UIImageWriteToSavedPhotosAlbum(timestampImage, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
          }
        }
      } else if mediaType.isEqual((kUTTypeMovie as String)) {
        if let videoUrl = info[UIImagePickerControllerMediaURL] as? NSURL {
          if let videoPath = videoUrl.relativePath {
            if newMedia {
              UISaveVideoAtPathToSavedPhotosAlbum(videoPath, nil, nil, nil)
            }
          }
        }
      }
    }
    dismiss(animated: true, completion: nil)
  }


  func image(_ image: UIImage, didFinishSavingWithError error: NSError?, contextInfo: UnsafeRawPointer) {
    if let error = error {
      // we got back an error!
      let ac = UIAlertController(title: "Save error", message: error.localizedDescription, preferredStyle: .alert)
      ac.addAction(UIAlertAction(title: "OK", style: .default))
      present(ac, animated: true)
    } else {
      let ac = UIAlertController(title: "Saved!", message: "Your altered image has been saved to your photos.", preferredStyle: .alert)
      ac.addAction(UIAlertAction(title: "OK", style: .default))
      present(ac, animated: true)
    }
  }

Answer 1:

You are treating a photo in the camera roll as if it were a UIImage. It isn't.

Let's take a worst-case scenario. Suppose we have a 12 MP camera; then your derived UIImage would be 4032x3024 pixels. Now you are asking for a graphics context scaled for the screen resolution. Suppose the screen of this device has 3x resolution: you are asking for a context that is 12096x9072, which is 109,734,912 pixels. To store the color data, you multiply that by the space the color info takes for each pixel, typically 4 bytes for an RGBA bitmap, which puts you well past 400 MB for a single context. No matter how you slice it, that is a boatload of memory you're asking for.
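One way to keep the bitmap from ballooning is to create the context at scale 1.0, so its pixel size matches the drawing size rather than tripling it, and to draw very large photos into a smaller context. Here is a minimal sketch of that idea, not code from the project; the stampedImage name, its parameters, and the 2000-point maximum dimension are illustrative assumptions.

  import UIKit

  func stampedImage(from image: UIImage,
                    text: NSString,
                    attributes: [String: Any],
                    at point: CGPoint,
                    maxDimension: CGFloat = 2000) -> UIImage? {
    // Optionally shrink the drawing size so a huge photo does not force a
    // huge bitmap: 4032x3024 at 4 bytes per pixel is roughly 49 MB, but the
    // same image tripled for a 3x screen is roughly 440 MB.
    let largestSide = max(image.size.width, image.size.height)
    let ratio = min(1, maxDimension / largestSide)
    let drawSize = CGSize(width: image.size.width * ratio,
                          height: image.size.height * ratio)

    // Pass a scale of 1.0 instead of UIScreen.main.scale so the context's
    // pixel dimensions equal drawSize, not drawSize times the screen scale.
    UIGraphicsBeginImageContextWithOptions(drawSize, false, 1.0)
    defer { UIGraphicsEndImageContext() }

    image.draw(in: CGRect(origin: .zero, size: drawSize))

    // Scale the text position by the same ratio so the timestamp lands in
    // the same relative spot on the smaller canvas.
    let scaledPoint = CGPoint(x: point.x * ratio, y: point.y * ratio)
    text.draw(at: scaledPoint, withAttributes: attributes)

    return UIGraphicsGetImageFromCurrentImageContext()
  }

If full-resolution output is required, the alternative is to keep image.size as the context size but still pass 1.0 (or image.scale) for the scale argument, accepting a bitmap on the order of 50 MB instead of 440 MB.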