I would like to get all frames of a video in iOS 6 into an NSArray.
I use this code:
```objc
- (void)getAllImagesFromVideo
{
    imagesArray = [[NSMutableArray alloc] init];
    times = [[NSMutableArray alloc] init];

    for (Float64 i = 0; i < 15; i += 0.033) // for 25 fps in 15 sec of video
    {
        CMTime time = CMTimeMakeWithSeconds(i, 60);
        [times addObject:[NSValue valueWithCMTime:time]];
    }

    [imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                         completionHandler:^(CMTime requestedTime,
                                                             CGImageRef image,
                                                             CMTime actualTime,
                                                             AVAssetImageGeneratorResult result,
                                                             NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded)
        {
            UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
            [imagesArray addObject:generatedImage];
        }
    }];
}
```
On the iPad simulator the delay is 90-100 seconds; on an iPad device I get memory warnings and finally a crash.
Any idea or solution? Should I use another, more low-level framework/library? C++? This is very important for me! Help me! :)
Thanks!!!
You need to keep memory under control while generating that many frames.
This is how I do it (notice I use the sync version for extracting images; it shouldn't matter if you choose the async version):
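A minimal sketch of that synchronous approach, assuming a local `videoURL`; the tolerance settings, the 320x180 `maximumSize`, and the method name are illustrative choices, not the answerer's exact code:

```objc
- (NSArray *)imagesFromVideoAtURL:(NSURL *)videoURL // hypothetical name
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator =
        [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];

    // Zero tolerance forces exact-frame extraction (slower but precise);
    // loosen these if speed matters more than accuracy.
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter  = kCMTimeZero;

    // Downscale the frames to bound memory use; hundreds of
    // full-resolution frames are what cause the memory warnings.
    generator.maximumSize = CGSizeMake(320, 180);

    NSMutableArray *images = [NSMutableArray array];
    for (Float64 i = 0; i < 15; i += 1.0 / 25.0) // 25 fps, 15 seconds
    {
        @autoreleasepool {
            CMTime time = CMTimeMakeWithSeconds(i, 600);
            NSError *error = nil;
            CMTime actualTime;
            CGImageRef cgImage = [generator copyCGImageAtTime:time
                                                   actualTime:&actualTime
                                                        error:&error];
            if (cgImage) {
                [images addObject:[UIImage imageWithCGImage:cgImage]];
                // copyCGImageAtTime follows the Create rule: release it here.
                CGImageRelease(cgImage);
            }
        }
    }
    return images;
}
```

Releasing each `CGImageRef` as soon as the `UIImage` wraps it, and draining a pool per iteration, keeps peak memory roughly constant regardless of frame count.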
EDIT:
You should always consider using @autoreleasepool when creating a lot of temporary objects (see https://developer.apple.com/library/mac/documentation/cocoa/conceptual/memorymgmt/articles/mmAutoreleasePools.html).
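For example, the question's loop can drain its temporaries once per iteration instead of holding them all until the loop ends (the `processFrameAtTime:` helper below is hypothetical, standing in for whatever per-frame work you do):

```objc
for (Float64 i = 0; i < 15; i += 0.033)
{
    @autoreleasepool {
        // Temporary NSValue/UIImage objects created in this iteration are
        // released when the pool drains at the closing brace, so peak
        // memory stays flat across hundreds of frames.
        CMTime time = CMTimeMakeWithSeconds(i, 600);
        [self processFrameAtTime:time]; // hypothetical per-frame work
    }
}
```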
It sounds like you are running into memory issues from holding 375 images at once. Try this instead; it may provide better memory management.
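One memory-friendlier sketch: keep the asynchronous generator from the question, but write each frame to disk as it arrives and store only the file path, so the 375 bitmaps are never resident at once. This assumes `imageGenerator` and `times` from the question, plus a hypothetical `framePaths` `NSMutableArray` ivar; the file naming is illustrative.

```objc
[imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                     completionHandler:^(CMTime requestedTime,
                                                         CGImageRef image,
                                                         CMTime actualTime,
                                                         AVAssetImageGeneratorResult result,
                                                         NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded)
    {
        @autoreleasepool {
            // Persist the frame and keep only its path, not the bitmap.
            NSData *jpeg = UIImageJPEGRepresentation(
                [UIImage imageWithCGImage:image], 0.8);
            NSString *name = [NSString stringWithFormat:@"frame-%f.jpg",
                              CMTimeGetSeconds(requestedTime)];
            NSString *path =
                [NSTemporaryDirectory() stringByAppendingPathComponent:name];
            [jpeg writeToFile:path atomically:YES];
            [framePaths addObject:path]; // framePaths: assumed ivar
        }
    }
}];
```

Frames can then be reloaded lazily with `[UIImage imageWithContentsOfFile:]` only when they are actually displayed.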