How to correct the orientation of a video in Objective-C

Posted 2019-05-10 22:32

I am using UIImagePickerController to capture video on the iPhone, but the recorded video plays back rotated 90 degrees to the right.

How can I fix this issue? Is there a way to correct the orientation?
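
For reference, the capture setup is presumably something like the sketch below; the picker configuration and the delegate wiring are assumptions, not taken from the question:

#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

// hypothetical capture setup that reproduces the issue
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.mediaTypes = @[(NSString *)kUTTypeMovie];
picker.delegate = self; // conforms to UINavigationControllerDelegate, UIImagePickerControllerDelegate
[self presentViewController:picker animated:YES completion:nil];

// in imagePickerController:didFinishPickingMediaWithInfo: the recorded file arrives as
// info[UIImagePickerControllerMediaURL]; its orientation is stored only as track metadata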

2 Answers
狗以群分 · #2 · 2019-05-10 23:00

You need to re-encode the video with an AVAssetExportSession to rotate it to the correct orientation, as Oleksiy Ivanov said. You need something like this:

NSError *error = nil;
AVURLAsset *videoAssetURL = [[AVURLAsset alloc] initWithURL:self.videoUrl options:nil]; 

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

AVAssetTrack *videoTrack = [[videoAssetURL tracksWithMediaType:AVMediaTypeVideo] firstObject];
AVAssetTrack *audioTrack = [[videoAssetURL tracksWithMediaType:AVMediaTypeAudio] firstObject];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetURL.duration) ofTrack:videoTrack atTime:kCMTimeZero error:&error];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAssetURL.duration) ofTrack:audioTrack atTime:kCMTimeZero error:&error];

CGAffineTransform transformToApply = videoTrack.preferredTransform;
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
[layerInstruction setTransform:transformToApply atTime:kCMTimeZero];
[layerInstruction setOpacity:0.0 atTime:videoAssetURL.duration];

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake( kCMTimeZero, videoAssetURL.duration);
instruction.layerInstructions = @[layerInstruction];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = @[instruction];
videoComposition.frameDuration = CMTimeMake(1, 30); // frames per second
videoComposition.renderScale = 1.0;
videoComposition.renderSize = CGSizeMake(640, 640); // choose your output size

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];

exportSession.outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"videoname.mp4"]];
exportSession.outputFileType = AVFileTypeMPEG4; // important: choose the output format (AVFileTypeQuickTimeMovie, AVFileTypeMPEG4, etc.) and match the file extension to it
exportSession.videoComposition = videoComposition;
exportSession.shouldOptimizeForNetworkUse = NO;
exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, videoAssetURL.duration);

[exportSession exportAsynchronouslyWithCompletionHandler:^{

    switch ([exportSession status]) {

        case AVAssetExportSessionStatusFailed:
            NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
            break;

        case AVAssetExportSessionStatusCompleted: {

            NSLog(@"Export completed");

            // generate a video thumbnail from the exported file
            self.videoUrl = exportSession.outputURL;
            AVURLAsset *exportedAsset = [[AVURLAsset alloc] initWithURL:self.videoUrl options:nil];
            AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:exportedAsset];
            imageGenerator.appliesPreferredTrackTransform = YES;
            CMTime time = CMTimeMakeWithSeconds(0.0, 600);
            NSError *thumbnailError = nil;
            CMTime actualTime;

            CGImageRef cgImage = [imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&thumbnailError];
            self.videoImage = [[UIImage alloc] initWithCGImage:cgImage];
            CGImageRelease(cgImage);

            break;
        }
        default: {
            break;
        }
    }
}];

Just replace self.videoUrl with your video URL and it should work fine :)
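
One detail worth adding (not from the original answer): AVAssetExportSession will fail if a file already exists at its outputURL, so it is safer to delete any previous export before starting, for example:

NSURL *outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"videoname.mp4"]];
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil]; // ignore "file not found"
exportSession.outputURL = outputURL;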

我命由我不由天 · #3 · 2019-05-10 23:04

[EDIT] Edited to add a description of how video re-encoding can be done on the device.

When video is recorded with UIImagePickerController, its orientation should be embedded in the video file as rotation metadata (the video track's preferred transform); the video itself was shot in portrait orientation. When such a video is played on the iPhone (or in any other way that honours this rotation metadata) it should be oriented correctly. One way to play a video is explained at http://mobile.tutsplus.com/tutorials/iphone/mediaplayer-framework_mpmovieplayercontroller_ios4/ .
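
A minimal playback sketch along the lines of that tutorial, using MPMoviePlayerController (which honours the embedded rotation metadata); the moviePlayer property used to keep the player alive is an assumption:

#import <MediaPlayer/MediaPlayer.h>

// keep a strong reference (e.g. a property), otherwise the player is deallocated immediately
self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:self.videoUrl];
self.moviePlayer.view.frame = self.view.bounds;
[self.view addSubview:self.moviePlayer.view];
[self.moviePlayer play];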

If the video is played in a custom way, the orientation should be extracted from the video (for example as described at https://stackoverflow.com/a/9195350/2546685) and applied during playback.
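
A sketch of that extraction, following the common pattern from the linked answer: read the video track's preferredTransform and map it to an orientation (asset here stands for your AVURLAsset):

AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGAffineTransform txf = videoTrack.preferredTransform;
UIInterfaceOrientation orientation = UIInterfaceOrientationLandscapeRight; // identity transform
if (txf.a == 0 && txf.b == 1.0 && txf.c == -1.0 && txf.d == 0) {
    orientation = UIInterfaceOrientationPortrait;            // frames rotated 90° clockwise
} else if (txf.a == 0 && txf.b == -1.0 && txf.c == 1.0 && txf.d == 0) {
    orientation = UIInterfaceOrientationPortraitUpsideDown;  // rotated 90° counter-clockwise
} else if (txf.a == -1.0 && txf.b == 0 && txf.c == 0 && txf.d == -1.0) {
    orientation = UIInterfaceOrientationLandscapeLeft;       // rotated 180°
}
// apply the matching rotation to your custom player's view or layer during playback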

It is also possible to re-encode the video using an AVAssetExportSession so that it is rotated to the correct orientation and no longer depends on the rotation metadata during playback. An example of how re-encoding can be done, combining this answer https://stackoverflow.com/a/16314552/2546685 with this one https://stackoverflow.com/a/9195350/2546685 (I have not tried to compile it, so syntax errors may exist):

AVURLAsset *footageVideo = [AVURLAsset URLAssetWithURL:assetURL options:nil];
AVAssetTrack *footageVideoTrack = [footageVideo tracksWithMediaType:AVMediaTypeVideo][0];

CGAffineTransform t = footageVideoTrack.preferredTransform;

AVMutableComposition *composition = [AVMutableComposition composition];

AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

[videoCompositionTrack insertTimeRange:footageVideoTrack.timeRange ofTrack:footageVideoTrack atTime:kCMTimeZero error:NULL];
// carry the source track's rotation metadata over to the composition track
videoCompositionTrack.preferredTransform = t;

NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition];
if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {

    self.exportSession = [[AVAssetExportSession alloc]
                          initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];

    NSURL *furl = [NSURL fileURLWithPath:self.tmpVideoPath];

    self.exportSession.outputURL = furl;
    // provide outputFileType according to the output file's extension
    self.exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    self.exportSession.timeRange = footageVideoTrack.timeRange;

    self.btnTrim.hidden = YES;
    self.myActivityIndicator.hidden = NO;
    [self.myActivityIndicator startAnimating];

    [self.exportSession exportAsynchronouslyWithCompletionHandler:^{

        switch ([self.exportSession status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", [[self.exportSession error] localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                break;
            default:
                NSLog(@"Export completed");
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self.myActivityIndicator stopAnimating];
                    self.myActivityIndicator.hidden = YES;
                });
                break;
        }
    }];
}