CoreAnimation, AVFoundation and ability to make Video export


I'm looking for the correct way to export my picture sequence into a QuickTime video.

I know that AVFoundation can merge or recombine videos, and can also add an audio track to build a single video asset.

Now my goal is a little different. I would like to create a video from scratch. I have a set of UIImage objects and need to render all of them into a single video. I read all the Apple documentation about AVFoundation and found the AVVideoCompositionCoreAnimationTool class, which has the ability to take a Core Animation layer and re-encode it as a video. I also checked the AVEditDemo project provided by Apple, but something doesn't seem to work in my project.

Here are my steps:

1) I create the Core Animation layer

CALayer *animationLayer = [CALayer layer];
[animationLayer setFrame:CGRectMake(0, 0, 1024, 768)];

CALayer *backgroundLayer = [CALayer layer];
[backgroundLayer setFrame:animationLayer.frame];
[backgroundLayer setBackgroundColor:[UIColor blackColor].CGColor];

CALayer *anImageLayer = [CALayer layer];
[anImageLayer setFrame:animationLayer.frame];

CAKeyframeAnimation *changeImageAnimation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
[changeImageAnimation setDelegate:self];
changeImageAnimation.duration = [[albumSettings transitionTime] floatValue] * [uiImagesArray count];
changeImageAnimation.repeatCount = 1;
changeImageAnimation.values = [NSArray arrayWithArray:uiImagesArray];
changeImageAnimation.removedOnCompletion = YES;
[anImageLayer addAnimation:changeImageAnimation forKey:nil];

[animationLayer addSublayer:anImageLayer];

2) Then I instantiate the AVComposition

AVMutableComposition *composition = [AVMutableComposition composition];
composition.naturalSize = CGSizeMake(1024, 768);

CALayer *wrapLayer = [CALayer layer];
wrapLayer.frame = CGRectMake(0, 0, 1024, 768);
CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0, 0, 1024, 768);
[wrapLayer addSublayer:animationLayer];
[wrapLayer addSublayer:videoLayer];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];


AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMake([imagesFilePath count] * [[albumSettings transitionTime] intValue] * 25, 25));

AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstruction];
videoCompositionInstruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];

videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:wrapLayer];
videoComposition.frameDuration = CMTimeMake(1, 25); // 25 fps
videoComposition.renderSize = CGSizeMake(1024, 768);
videoComposition.instructions = [NSArray arrayWithObject:videoCompositionInstruction];

3) I export the video to the documents path

AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetLowQuality];
session.videoComposition = videoComposition;

NSString *filePath = nil;
filePath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
filePath = [filePath stringByAppendingPathComponent:@"Output.mov"];    

session.outputURL = [NSURL fileURLWithPath:filePath];
session.outputFileType = AVFileTypeQuickTimeMovie;

[session exportAsynchronouslyWithCompletionHandler:^
 {
     dispatch_async(dispatch_get_main_queue(), ^{
         NSLog(@"Export Finished: %@", session.error);
         if (session.error) {
             [[NSFileManager defaultManager] removeItemAtPath:filePath error:NULL];
         }
     });
 }];

At the end of the export I get this error:

Export Finished: Error Domain=AVFoundationErrorDomain Code=-11822 "Cannot Open" UserInfo=0x49a97c0 {NSLocalizedFailureReason=This media format is not supported., NSLocalizedDescription=Cannot Open}

I found this in the documentation: AVErrorInvalidSourceMedia = -11822

AVErrorInvalidSourceMedia: The operation could not be completed because some source media could not be read.

I'm sure the Core Animation I built is correct, because I rendered it into a test layer and could see the animation progressing correctly.

Can anyone help me understand where my error is?

2 Answers
唯我独甜

Maybe you need a fake movie that contains only black frames to fill the video layer, and then add a CALayer on top to manipulate the images; a rough sketch of that idea follows.
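A minimal sketch of that idea, assuming a short all-black placeholder clip named blank.mov is bundled with the app (the file name and the blankAsset / blankTrack variables are assumptions of mine; composition, imagesFilePath and albumSettings come from the question): give the AVMutableComposition a real video track by inserting the black clip, stretch it to the slideshow length, and create the layer instruction from that track so the animation tool has actual source media to post-process.

// Hypothetical placeholder clip: a short, fully black movie shipped in the app bundle.
NSURL *blankURL = [[NSBundle mainBundle] URLForResource:@"blank" withExtension:@"mov"];
AVURLAsset *blankAsset = [AVURLAsset URLAssetWithURL:blankURL options:nil];
AVAssetTrack *blankTrack = [[blankAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// Give the composition a real source video track (the failing export had none).
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *trackError = nil;
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, blankAsset.duration)
                    ofTrack:blankTrack
                     atTime:kCMTimeZero
                      error:&trackError];

// Stretch the single black clip so it covers the whole slideshow duration
// (the same duration the question computes for its composition instruction).
CMTime slideshowDuration = CMTimeMake([imagesFilePath count] * [[albumSettings transitionTime] intValue] * 25, 25);
[videoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, blankAsset.duration)
                toDuration:slideshowDuration];

// Build the layer instruction from this track instead of creating it empty.
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];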

Lonely孤独者°

"I found the AVVideoCompositionCoreAnimationTool class, which has the ability to take a Core Animation layer and re-encode it as a video"

My understanding was that this is instead only able to take Core Animation content and add it to an existing video. I just checked the docs, and the only methods available require a video layer too.

EDIT: yep. Digging into the docs and WWDC videos, I think you should be using AVAssetWriter instead, appending images to the writer. Something like:

NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:somePath]
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];

// H.264 output settings; use whatever size matches your images.
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:320], AVVideoWidthKey,
                               [NSNumber numberWithInt:480], AVVideoHeightKey,
                               nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                      outputSettings:videoSettings];

[videoWriter addInput:writerInput];

[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
[writerInput appendSampleBuffer:sampleBuffer]; // one append per frame you want in the movie
[writerInput markAsFinished];
[videoWriter endSessionAtSourceTime:CMTimeMakeWithSeconds(60, 30)];
[videoWriter finishWriting];
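The sampleBuffer in the snippet above is left undefined; for a sequence of still images the usual route is an AVAssetWriterInputPixelBufferAdaptor, turning each UIImage into a CVPixelBuffer and appending it with a presentation time. A rough sketch under that assumption (pixelBufferFromImage is a hypothetical helper of mine, not AVFoundation API, and uiImagesArray plus the one-second-per-image timing are assumed):

// Hypothetical helper: draw a UIImage into a newly created CVPixelBuffer.
static CVPixelBufferRef pixelBufferFromImage(UIImage *image, CGSize size)
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef buffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, (size_t)size.width, (size_t)size.height,
                        kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options, &buffer);

    CVPixelBufferLockBaseAddress(buffer, 0);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(buffer),
                                                 (size_t)size.width, (size_t)size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(buffer),
                                                 colorSpace, kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image.CGImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    return buffer;
}

// Attach a pixel buffer adaptor to the writer input created above.
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                      sourcePixelBufferAttributes:nil];

// Append one buffer per image; here each image is held for one second at a 30 timescale.
CMTime frameTime = kCMTimeZero;
for (UIImage *image in uiImagesArray) {
    CVPixelBufferRef buffer = pixelBufferFromImage(image, CGSizeMake(320, 480));
    while (!writerInput.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.05]; // crude back-pressure handling, fine for a sketch
    }
    [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
    CVPixelBufferRelease(buffer);
    frameTime = CMTimeAdd(frameTime, CMTimeMake(30, 30));
}

markAsFinished and finishWriting then follow exactly as in the snippet above.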