How to Get Video from a Live Photo in iOS

Posted 2019-03-28 01:51

Question:

I'm trying to figure it out, but can't find any useful information. I only found this:

PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes, 
toFile: fileURL, options: nil, completionHandler: 
{
     // Video file has been written to path specified via fileURL
})

but I'm ashamed to say I have no idea how to make use of it. I've created a UIImagePickerController and loaded an image from the Camera Roll.

Answer 1:

Use this code to get the video from a live photo:

- (void)videoUrlForLivePhotoAsset:(PHAsset*)asset withCompletionBlock:(void (^)(NSURL* url))completionBlock{
    if([asset isKindOfClass:[PHAsset class]]){
        // Build a temporary file path to write the paired video to
        NSString* filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%.0f.mov", [[NSDate date] timeIntervalSince1970]]];
        NSURL *fileUrl = [NSURL fileURLWithPath:filePath];

        PHLivePhotoRequestOptions* options = [PHLivePhotoRequestOptions new];
        options.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;
        options.networkAccessAllowed = YES; // allow downloading the asset from iCloud if needed
        [[PHImageManager defaultManager] requestLivePhotoForAsset:asset targetSize:[UIScreen mainScreen].bounds.size contentMode:PHImageContentModeDefault options:options resultHandler:^(PHLivePhoto * _Nullable livePhoto, NSDictionary * _Nullable info) {
            if(livePhoto){
                NSArray* assetResources = [PHAssetResource assetResourcesForLivePhoto:livePhoto];
                PHAssetResource* videoResource = nil;
                for(PHAssetResource* resource in assetResources){
                    if (resource.type == PHAssetResourceTypePairedVideo) {
                        videoResource = resource;
                        break;
                    }
                }
                if(videoResource){
                    [[PHAssetResourceManager defaultManager] writeDataForAssetResource:videoResource toFile:fileUrl options:nil completionHandler:^(NSError * _Nullable error) {
                        if(!error){
                            completionBlock(fileUrl);
                        }else{
                            completionBlock(nil);
                        }
                    }];
                }else{
                    completionBlock(nil);
                }
            }else{
                completionBlock(nil);
            }
        }];
    }else{
        completionBlock(nil);
    }
}

Basically, you first need to fetch the PHLivePhoto object from your PHAsset. After that, traverse all the asset resources within the live photo and check whether each one is of type PHAssetResourceTypePairedVideo.

If one is, you have found your video. You then need to save it to a temporary directory, as I did here, and use that file for whatever purpose you have.
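
If you are working in Swift, a minimal sketch of the same flow might look like the following. It is untested, the function name pairedVideoURL(for:completion:) is purely illustrative, and it assumes photo library authorization has already been granted:

import Photos
import UIKit

// A sketch of the same flow in Swift: request the PHLivePhoto, find its
// paired-video resource, and write it to a temporary .mov file.
// The function name pairedVideoURL(for:completion:) is illustrative.
func pairedVideoURL(for asset: PHAsset, completion: @escaping (URL?) -> Void) {
    let options = PHLivePhotoRequestOptions()
    options.deliveryMode = .fastFormat
    options.isNetworkAccessAllowed = true  // allow downloading from iCloud if needed

    PHImageManager.default().requestLivePhoto(
        for: asset,
        targetSize: UIScreen.main.bounds.size,
        contentMode: .default,
        options: options
    ) { livePhoto, _ in
        guard let livePhoto = livePhoto else {
            completion(nil)
            return
        }

        // The paired video is one of the live photo's asset resources.
        let resources = PHAssetResource.assetResources(for: livePhoto)
        guard let videoResource = resources.first(where: { $0.type == .pairedVideo }) else {
            completion(nil)
            return
        }

        let fileURL = URL(fileURLWithPath: NSTemporaryDirectory())
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mov")

        PHAssetResourceManager.default().writeData(for: videoResource, toFile: fileURL, options: nil) { error in
            // This handler may run on a background queue; hop to the main queue before touching UI.
            completion(error == nil ? fileURL : nil)
        }
    }
}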

To play this video, you can use the following code:

// AVPlayer lives in AVFoundation and AVPlayerViewController in AVKit;
// fileUrl is already an NSURL, so it can be passed to AVPlayer directly.
AVPlayer *player = [AVPlayer playerWithURL:fileUrl];
AVPlayerViewController *playerViewController = [AVPlayerViewController new];
playerViewController.player = player;
[self presentViewController:playerViewController animated:YES completion:nil];

Feel free to ask if you need any clarification.

P.S. I made a few changes to this method to remove dependencies on my application's code, so the above code is untested; however, I feel it should work as expected.



Answer 2:

The question is a little confusing.

First, if you want to pick a live photo and play it, I recommend using the Photos framework instead of UIImagePickerController. That way you can fetch the asset and have more control. You can then play the live photo in full, or just the short muted preview, with PHLivePhotoView by passing .full or .hint to startPlayback(with:), as in the sketch below.
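
For example, a rough sketch of this approach (the helper name playNewestLivePhoto(in:) is illustrative, and it assumes photo library access has already been authorized) could look like:

import Photos
import PhotosUI
import UIKit

// A sketch: fetch the newest asset in the Live Photos smart album and
// play it in a PHLivePhotoView added to the given container view.
func playNewestLivePhoto(in containerView: UIView) {
    guard let album = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                              subtype: .smartAlbumLivePhotos,
                                                              options: nil).firstObject,
          let asset = PHAsset.fetchAssets(in: album, options: nil).lastObject else {
        return
    }

    let livePhotoView = PHLivePhotoView(frame: containerView.bounds)
    containerView.addSubview(livePhotoView)

    PHImageManager.default().requestLivePhoto(for: asset,
                                              targetSize: livePhotoView.bounds.size,
                                              contentMode: .aspectFit,
                                              options: nil) { livePhoto, _ in
        livePhotoView.livePhoto = livePhoto
        // .full plays the whole video with audio; .hint plays the short muted preview.
        livePhotoView.startPlayback(with: .full)
    }
}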

You can refer to the code here:

  • The GitHub repo LivePreview shows how to select a live photo and play it.

Second, if you want to convert a live photo to a .mov file, the code you pasted will work; if you want to play the .mov directly, you may want to use AVPlayer.
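
For instance, a minimal sketch of playing such a .mov with AVPlayerViewController (assuming videoURL is a file URL produced by code like that in the first answer, and the helper name is illustrative):

import AVFoundation
import AVKit
import UIKit

// Present the exported Live Photo video with AVPlayerViewController.
func playVideo(at videoURL: URL, from presenter: UIViewController) {
    let player = AVPlayer(url: videoURL)
    let playerController = AVPlayerViewController()
    playerController.player = player
    presenter.present(playerController, animated: true) {
        player.play()
    }
}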

Plus, WWDC provides an example app using the Photos framework.



Answer 3:

Swift 4 version

import UIKit
import Photos
import MobileCoreServices

// The containing view controller must conform to UIImagePickerControllerDelegate and UINavigationControllerDelegate
@IBAction func showImagePicker(sender: UIButton) {
    let picker = UIImagePickerController()
    picker.delegate = self;
    picker.allowsEditing = false;
    picker.sourceType = .photoLibrary;
    picker.mediaTypes = [kUTTypeLivePhoto as String, kUTTypeImage as String];

    present(picker, animated: true, completion: nil);
}

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    guard
        let livePhoto = info[UIImagePickerControllerLivePhoto] as? PHLivePhoto,
        let photoDir = generateFolderForLivePhotoResources()
        else {
            return;
    }

    let assetResources = PHAssetResource.assetResources(for: livePhoto)
    for resource in assetResources {

        // SAVE FROM BUFFER
//            let buffer = NSMutableData()
//            PHAssetResourceManager.default().requestData(for: resource, options: nil, dataReceivedHandler: { (chunk) in
//                buffer.append(chunk)
//            }, completionHandler: {[weak self] error in
//                self?.saveAssetResource(resource: resource, inDirectory: photoDir, buffer: buffer, maybeError: error)
//            })

        // SAVE DIRECTLY
        saveAssetResource(resource: resource, inDirectory: photoDir, buffer: nil, maybeError: nil)
    }

    picker.dismiss(animated: true) {}
}

func saveAssetResource(
    resource: PHAssetResource,
    inDirectory: NSURL,
    buffer: NSMutableData?, maybeError: Error?
    ) -> Void {
    guard maybeError == nil else {
        print("Could not request data for resource: \(resource), error: \(String(describing: maybeError))")
        return
    }

    let maybeExt = UTTypeCopyPreferredTagWithClass(
        resource.uniformTypeIdentifier as CFString,
        kUTTagClassFilenameExtension
        )?.takeRetainedValue()

    guard let ext = maybeExt else {
        return
    }

    guard var fileUrl = inDirectory.appendingPathComponent(NSUUID().uuidString) else {
        print("file url error")
        return
    }

    fileUrl = fileUrl.appendingPathExtension(ext as String)

    if let buffer = buffer, buffer.write(to: fileUrl, atomically: true) {
        print("Saved resource form buffer \(resource) to filepath \(String(describing: fileUrl))")
    } else {
        PHAssetResourceManager.default().writeData(for: resource, toFile: fileUrl, options: nil) { (error) in
            if let error = error {
                print("Could not save resource \(resource) directly: \(error)")
            } else {
                print("Saved resource directly \(resource) to filepath \(String(describing: fileUrl))")
            }
        }
    }
}

func generateFolderForLivePhotoResources() -> NSURL? {
    let photoDir = NSURL(
        // NB: Files in NSTemporaryDirectory() are automatically cleaned up by the OS
        fileURLWithPath: NSTemporaryDirectory(),
        isDirectory: true
        ).appendingPathComponent(NSUUID().uuidString)

    let fileManager = FileManager()
    // we need to specify type as ()? as otherwise the compiler generates a warning
    let success : ()? = try? fileManager.createDirectory(
        at: photoDir!,
        withIntermediateDirectories: true,
        attributes: nil
    )

    return success != nil ? photoDir! as NSURL : nil
}

In-depth tutorial here: Live Photo API on iOS