I need to do a simple thing: play a video while rotating it and applying a CIFilter to it.
First, I create the player item:
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:videoURL];
// DEBUG LOGGING
AVAssetTrack *track = [[item.asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
NSLog(@"Natural size is: %@", NSStringFromCGSize(track.naturalSize));
NSLog(@"Preferred track transform is: %@", NSStringFromCGAffineTransform(track.preferredTransform));
NSLog(@"Preferred asset transform is: %@", NSStringFromCGAffineTransform(item.asset.preferredTransform));
Then I need to apply the video composition. Originally, I was thinking to create an AVVideoComposition with 2 instructions: an AVVideoCompositionLayerInstruction for the rotation, and a second instruction for the CIFilter application. However, that throws an exception saying "Expecting video composition to contain only AVCoreImageFilterVideoCompositionInstruction", which means Apple doesn't allow combining those 2 kinds of instruction. As a result, I combined both under the filtering; here is the code:
AVAsset *asset = playerItem.asset;
CGAffineTransform rotation = [self transformForItem:playerItem];
AVVideoComposition *composition = [AVVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest * _Nonnull request) {
// Step 1: get the input frame image (screenshot 1)
CIImage *sourceImage = request.sourceImage;
// Step 2: rotate the frame
CIFilter *transformFilter = [CIFilter filterWithName:@"CIAffineTransform"];
[transformFilter setValue:sourceImage forKey:kCIInputImageKey];
[transformFilter setValue:[NSValue valueWithCGAffineTransform:rotation] forKey:kCIInputTransformKey];
sourceImage = transformFilter.outputImage;
// Reuse the same filter to translate the rotated extent back to the origin
CGRect extent = sourceImage.extent;
CGAffineTransform translation = CGAffineTransformMakeTranslation(-extent.origin.x, -extent.origin.y);
[transformFilter setValue:sourceImage forKey:kCIInputImageKey];
[transformFilter setValue:[NSValue valueWithCGAffineTransform:translation] forKey:kCIInputTransformKey];
sourceImage = transformFilter.outputImage;
// Step 3: apply the custom filter chosen by the user
// (`filter` is the user-selected CIFilter captured from the enclosing scope).
// Clamp to an infinite extent so edge-sampling filters don't shrink the frame,
// then crop back to the pre-filter extent.
extent = sourceImage.extent;
sourceImage = [sourceImage imageByClampingToExtent];
[filter setValue:sourceImage forKey:kCIInputImageKey];
sourceImage = filter.outputImage;
sourceImage = [sourceImage imageByCroppingToRect:extent];
// Step 4: finish processing the frame (screenshot 2)
[request finishWithImage:sourceImage context:nil];
}];
playerItem.videoComposition = composition;
The screenshots I made during debugging show that the image is successfully rotated and the filter is applied (in this example it was an identity filter which doesn't change the image). Here are screenshot 1 and screenshot 2, taken at the points marked in the comments above:
As you can see, the rotation is successful and the extent of the resulting frame is also correct.
The problem starts when I try to play this video in a player. Here is what I get:
So it seems like all the frames are scaled and shifted down. The green area is empty frame space; when I clamp to an infinite extent, it shows border pixels instead of green. I have a feeling that the player still takes some old, pre-rotation size info from the AVPlayerItem, which is why I logged the sizes and transforms in the first code snippet above. Here are the logs:
Natural size is: {1920, 1080}
Preferred track transform is: [0, 1, -1, 0, 1080, 0]
Preferred asset transform is: [1, 0, 0, 1, 0, 0]
The player is set up like this:
layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
layer.needsDisplayOnBoundsChange = YES;
PLEASE NOTE the most important thing: this only happens to videos which were recorded by the app itself, using the camera in landscape iPhone (6s) orientation, and previously saved to device storage. The videos that the app records in portrait mode are totally fine. (By the way, the portrait videos produce exactly the same size and transform logs as the landscape videos! Strange... maybe the iPhone puts the rotation info in the video and fixes it.) So the zooming and shifting of the video looks like a combination of "aspect fill" and the old, pre-rotation resolution info. By the way, the portrait video frames are shown only partially because of scaling to fill the player area, which has a different aspect ratio, but this is expected behavior.
Let me know your thoughts on this, and if you know a better way to accomplish what I need, it would be great to know.
UPDATE: It turns out there is an easier way to "change" the AVPlayerItem video dimensions during playback: set the renderSize property of the video composition (this can be done using the AVMutableVideoComposition class).

MY OLD ANSWER BELOW:
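(A minimal sketch of the renderSize approach from the update above, assuming the same `playerItem` and filtering handler as in the question; `AVMutableVideoComposition` provides its own `videoCompositionWithAsset:applyingCIFiltersWithHandler:`:)

```objc
AVMutableVideoComposition *composition = [AVMutableVideoComposition
    videoCompositionWithAsset:playerItem.asset
    applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest * _Nonnull request) {
        // ... the same rotation + filtering steps as in the question ...
        [request finishWithImage:request.sourceImage context:nil];
    }];
// Report the post-rotation dimensions to the player (1080x1920 for this video)
composition.renderSize = CGSizeMake(1080, 1920);
playerItem.videoComposition = composition;
```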
After a lot of debugging I understood the problem and found a solution. My initial guess that AVPlayer still considers the video being of the original size was correct. In the image below it is explained what was happening:
As for the solution, I couldn't find a way to change the video size inside AVAsset or AVPlayerItem. So I just manipulated the video to fit the size and scale that AVPlayer was expecting, and then, when playing in a player with the correct aspect ratio and the flag to scale and fill the player area, everything looks good. Here is the graphical explanation:

And here goes the additional code that needs to be inserted in the applyingCIFiltersWithHandler block mentioned in the question:
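(The original snippet is not included in this excerpt. Based on the description above — scale and translate the rotated frame back into the extent AVPlayer expects — a hedged sketch of what goes at the end of the handler might look like this; it assumes `request.renderSize` reports the pre-rotation size, and the exact math may differ from the author's actual code:)

```objc
// At the end of the applyingCIFiltersWithHandler: block, replacing
// [request finishWithImage:sourceImage context:nil];

// AVPlayer still lays frames out using the pre-rotation size, so scale the
// rotated image down to fit that extent and center it; aspect-fill in the
// player layer then stretches it back over the visible area.
CGSize expected = request.renderSize;   // pre-rotation size (e.g. 1920x1080)
CGRect rotated  = sourceImage.extent;   // post-rotation extent (e.g. 1080x1920)
CGFloat scale   = MIN(expected.width  / CGRectGetWidth(rotated),
                      expected.height / CGRectGetHeight(rotated));
sourceImage = [sourceImage imageByApplyingTransform:
                  CGAffineTransformMakeScale(scale, scale)];
CGFloat dx = (expected.width  - CGRectGetWidth(sourceImage.extent))  / 2.0
             - CGRectGetMinX(sourceImage.extent);
CGFloat dy = (expected.height - CGRectGetHeight(sourceImage.extent)) / 2.0
             - CGRectGetMinY(sourceImage.extent);
sourceImage = [sourceImage imageByApplyingTransform:
                  CGAffineTransformMakeTranslation(dx, dy)];
[request finishWithImage:sourceImage context:nil];
```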