How to store video on iPhone while publishing video

Posted 2019-03-11 02:56

Right now I am using RTMPStreamPublisher to publish the video to a Wowza server. It uploads there successfully, but can anyone tell me how I can also store the same video on the iPhone while it is being uploaded to the server?

I am using https://github.com/slavavdovichenko/MediaLibDemos, but there is not much documentation available. If I can simply store the data that is sent for publication, that would solve my problem.

Here is the method they are using to upload the stream, but I can't find a way to store the same video on my iPhone device:

// ACTIONS

-(void)doConnect {
#if 0 // use ffmpeg rtmp 
    NSString *url = [NSString stringWithFormat:@"%@/%@", hostTextField.text, streamTextField.text];
    upstream = [[BroadcastStreamClient alloc] init:url  resolution:RESOLUTION_LOW];
    upstream.delegate = self;
    upstream.encoder = [MPMediaEncoder new];
    [upstream start];
    socket = [[RTMPClient alloc] init:host];
    btnConnect.title = @"Disconnect";     
    return;
#endif

#if 0 // use inside RTMPClient instance
    upstream = [[BroadcastStreamClient alloc] init:hostTextField.text resolution:RESOLUTION_LOW];
    //upstream = [[BroadcastStreamClient alloc] initOnlyAudio:hostTextField.text];
    //upstream = [[BroadcastStreamClient alloc] initOnlyVideo:hostTextField.text resolution:RESOLUTION_LOW];

#else // use outside RTMPClient instance

    if (!socket) {
        socket = [[RTMPClient alloc] init:hostTextField.text];
        if (!socket) {
            [self showAlert:@"Socket has not be created"];
            return;
        }
        [socket spawnSocketThread];
    }
    upstream = [[BroadcastStreamClient alloc] initWithClient:socket resolution:RESOLUTION_LOW];
#endif

    [upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    //[upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    //[upstream setVideoBitrate:512000];
    upstream.delegate = self;
    [upstream stream:streamTextField.text publishType:PUBLISH_LIVE];
    //[upstream stream:streamTextField.text publishType:PUBLISH_RECORD];
    //[upstream stream:streamTextField.text publishType:PUBLISH_APPEND];
    btnConnect.title = @"Disconnect";     
}

I did find that, with the BroadcastStreamClient instance named "upstream", I can get the AVCaptureSession via the following line:

[upstream getCaptureSession];

How can I use this AVCaptureSession to record the video on the iPhone?

3 Answers
趁早两清
#2 · 2019-03-11 03:16

You could possibly cache the video with NSKeyedArchiver, as long as your object conforms to the <NSCoding> protocol (many collection types already do). Be aware that the Caches directory gets periodically cleaned out by the system:

http://khanlou.com/2015/07/cache-me-if-you-can/
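
For illustration, here is a minimal sketch of that idea (the helper name and file name are my own, not from MediaLibDemos), archiving an NSData blob into the Caches directory:

#import <Foundation/Foundation.h>

// Sketch: archive raw bytes into the Caches directory. NSData already conforms
// to NSCoding, so NSKeyedArchiver can write it directly. The system may purge
// this directory when disk space runs low.
static NSString *CacheVideoData(NSData *videoData) {
    NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) firstObject];
    NSString *cachePath = [cachesDir stringByAppendingPathComponent:@"pending-upload.dat"];
    [NSKeyedArchiver archiveRootObject:videoData toFile:cachePath];
    return cachePath;
}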

Alternatively, a better option may be to use NSTemporaryDirectory() (part of NSPathUtilities) to store your temporary video file locally. AVAssetExportSession should allow you to write the video file there:

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
exportSession.outputURL = outputFileURL;
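
A slightly fuller sketch of the same idea (the function name and file name are placeholders, and the AVAsset for the recording is assumed to exist already):

#import <AVFoundation/AVFoundation.h>

// Sketch: copy an already-recorded asset into NSTemporaryDirectory().
// Files there can be deleted by the system, so treat it as scratch space.
static void ExportAssetToTemporaryDirectory(AVAsset *asset) {
    NSString *tmpPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload-copy.mov"];
    NSURL *outputFileURL = [NSURL fileURLWithPath:tmpPath];

    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = outputFileURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Export finished: %@", outputFileURL);
        } else {
            NSLog(@"Export failed: %@", exportSession.error);
        }
    }];
}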
等我变得足够好
#3 · 2019-03-11 03:17

Try this once you have selected a video from the photo album or recorded one, e.g. in the didFinishPickingMediaWithInfo: method:

__block NSURL *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];
NSString *moviePath = [videoUrl path];

if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
    UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, nil, nil);
}
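
If you also want to know whether the save succeeded, you can pass a completion target and selector instead of nil (a sketch, assuming self is the controller handling the picker):

UISaveVideoAtPathToSavedPhotosAlbum(moviePath, self,
    @selector(video:didFinishSavingWithError:contextInfo:), NULL);

and implement the matching selector, whose signature UIKit expects for video saves:

- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Could not save video: %@", error);
    } else {
        NSLog(@"Saved %@ to the Saved Photos album", videoPath);
    }
}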
成全新的幸福
#4 · 2019-03-11 03:42

Once you have the AVCaptureSession, you can add an AVCaptureMovieFileOutput instance to it like this:

AVCaptureMovieFileOutput *movieFileOutput = [AVCaptureMovieFileOutput new];
if ([captureSession canAddOutput:movieFileOutput]) {
    [captureSession addOutput:movieFileOutput];
}

// Start recording
NSURL *outputURL = …
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

Source: https://www.objc.io/issues/23-video/capturing-video/

Also take a look at this in order to better understand how to use an AVCaptureFileOutput: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureFileOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureFileOutput
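
Since you pass self as the recordingDelegate, that object also needs to implement the AVCaptureFileOutputRecordingDelegate callback. A sketch of the required method, which is a reasonable place to keep the finished file or copy it to the Camera Roll:

// Called by AVCaptureMovieFileOutput when the movie file is finished.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    if (error) {
        // Note: a non-nil error can still mean the file was written; this sketch keeps it simple.
        NSLog(@"Recording failed: %@", error);
        return;
    }
    NSString *moviePath = [outputFileURL path];
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
        UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, nil, nil);
    }
}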
