Get all frames of a video in iOS 6


Question:

I would like to get all frames of a video in iOS 6 into an NSArray. I use this code:

-(void) getAllImagesFromVideo
{
    imagesArray = [[NSMutableArray alloc] init];
    times = [[NSMutableArray alloc] init];

    for (Float64 i = 0; i < 15; i += 0.033) // For 25 fps in 15 sec of video
    {
        CMTime time = CMTimeMakeWithSeconds(i, 60);
        [times addObject:[NSValue valueWithCMTime:time]];
    }

    [imageGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {

        if (result == AVAssetImageGeneratorSucceeded)
        {
            UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];

            [imagesArray addObject:generatedImage];
        }
    }];
}

On the iPad simulator the delay is 90–100 seconds; on an iPad device I receive memory warnings and it finally crashes.

Any ideas or solutions? Should I use another, more low-level framework or library? C++? This is very important for me! Help me! :)

Thanks!!!

Answer 1:

You need to:

  1. use CGImageRelease, as @zimmryan mentioned
  2. wrap the body of your loop in an @autoreleasepool block
  3. not store the images in memory, but save them to your Documents directory instead (see the sketch after the code below)

This is how I do it (notice that I use the synchronous version for extracting images; it shouldn't matter if you choose the asynchronous version):

// 'url' is the video file URL; FPS is assumed to be a frame-rate constant, e.g. 25
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * FPS; i++) {
    @autoreleasepool {
        CMTime time = CMTimeMake(i, FPS);
        NSError *err;
        CMTime actualTime;
        CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
        if (image == NULL) {
            continue; // skip any frame the generator could not produce
        }
        UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
        [self saveImage:generatedImage atTime:actualTime]; // Saves the image to the Documents directory, not memory
        CGImageRelease(image);
    }
}
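
The saveImage:atTime: call above is the answerer's own helper; its body is not shown in the original answer. A minimal sketch of what it might look like, assuming each frame should simply be written as a JPEG into the app's Documents directory, is:

// Hypothetical helper (not part of the original answer): writes one frame to disk
// so it never has to stay in memory.
- (void)saveImage:(UIImage *)image atTime:(CMTime)time
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];

    // Name each file after its actual timestamp, e.g. "frame_1.234.jpg"
    NSString *fileName = [NSString stringWithFormat:@"frame_%.3f.jpg", CMTimeGetSeconds(time)];
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:fileName];

    NSData *jpegData = UIImageJPEGRepresentation(image, 0.8); // 0.8 = JPEG compression quality
    [jpegData writeToFile:filePath atomically:YES];
}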

EDIT:

You should always consider using @autoreleasepool when creating a lot of temporary objects (see https://developer.apple.com/library/mac/documentation/cocoa/conceptual/memorymgmt/articles/mmAutoreleasePools.html).



Answer 2:

It sounds like you are running into memory issues from holding 375 images at once. Try this instead; it may provide better memory management.

-(void) getAllImagesFromVideo
{
    imagesArray = [[NSMutableArray alloc] initWithCapacity:375];
    times = [[NSMutableArray alloc] initWithCapacity:375];

    for (Float64 i = 0; i < 15; i += 0.033) // For 25 fps in 15 sec of video
    {
        [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(i, 60)]];
    }

    [imageGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded)
        {
            [imagesArray addObject:[UIImage imageWithCGImage:image]];
            CGImageRelease(image);
        }
    }];
}
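
Note that imageGenerator is never created in either snippet; it is assumed to be an already-configured AVAssetImageGenerator instance variable. A minimal setup sketch, assuming the video's URL is in a hypothetical videoURL variable, could be:

// Assumed setup for the imageGenerator instance variable used above
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.appliesPreferredTrackTransform = YES; // respect the video track's orientation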