Right now I am using RTMPStreamPublisher to publish video to a Wowza server. It uploads successfully, but can anyone tell me how I can store the same video on the iPhone while it is being uploaded to the server?
I am using https://github.com/slavavdovichenko/MediaLibDemos, but there is not much documentation available. If I can just store the data that is sent for publication, my work will be done.
Here is the method they use to upload the stream, but I can't find a way to store the same video on my iPhone:
// ACTIONS
- (void)doConnect {
#if 0 // use ffmpeg rtmp
    NSString *url = [NSString stringWithFormat:@"%@/%@", hostTextField.text, streamTextField.text];
    upstream = [[BroadcastStreamClient alloc] init:url resolution:RESOLUTION_LOW];
    upstream.delegate = self;
    upstream.encoder = [MPMediaEncoder new];
    [upstream start];
    socket = [[RTMPClient alloc] init:host];
    btnConnect.title = @"Disconnect";
    return;
#endif
#if 0 // use inside RTMPClient instance
    upstream = [[BroadcastStreamClient alloc] init:hostTextField.text resolution:RESOLUTION_LOW];
    //upstream = [[BroadcastStreamClient alloc] initOnlyAudio:hostTextField.text];
    //upstream = [[BroadcastStreamClient alloc] initOnlyVideo:hostTextField.text resolution:RESOLUTION_LOW];
#else // use outside RTMPClient instance
    if (!socket) {
        socket = [[RTMPClient alloc] init:hostTextField.text];
        if (!socket) {
            [self showAlert:@"Socket has not been created"];
            return;
        }
        [socket spawnSocketThread];
    }
    upstream = [[BroadcastStreamClient alloc] initWithClient:socket resolution:RESOLUTION_LOW];
#endif
    [upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    //[upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    //[upstream setVideoBitrate:512000];
    upstream.delegate = self;
    [upstream stream:streamTextField.text publishType:PUBLISH_LIVE];
    //[upstream stream:streamTextField.text publishType:PUBLISH_RECORD];
    //[upstream stream:streamTextField.text publishType:PUBLISH_APPEND];
    btnConnect.title = @"Disconnect";
}
I did find that from the BroadcastStreamClient instance named "upstream" I can get the AVCaptureSession via the following line:
[upstream getCaptureSession];
How can I use this AVCaptureSession to record the video on the iPhone?
You could possibly cache the video with NSKeyedArchiver, as long as your object conforms to the NSCoding protocol (many collection types already do). Note that the Caches directory gets periodically cleaned out: http://khanlou.com/2015/07/cache-me-if-you-can/
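As a minimal sketch of that approach, assuming your recorded data is held in objects that conform to NSCoding (the file name here is made up for illustration):

```objc
// Sketch: archive NSCoding-conforming objects into the Caches directory,
// which the system may purge under storage pressure (see the linked article).
NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                           NSUserDomainMask, YES) firstObject];
NSString *path = [cachesDir stringByAppendingPathComponent:@"pending-upload.archive"];

NSArray *chunks = @[ /* your NSCoding-conforming objects */ ];
BOOL saved = [NSKeyedArchiver archiveRootObject:chunks toFile:path];

// Restore later with:
// NSArray *restored = [NSKeyedUnarchiver unarchiveObjectWithFile:path];
```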
Or maybe a better alternative may be to use NSTemporaryDirectory() (part of NSPathUtilities) to store your temporary video file locally. AVAssetExportSession should allow you to put the video file there. Try this after you have selected a video from the photo album or recorded one, e.g. in the didFinishPickingMediaWithInfo: method.
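A sketch of that export, assuming `assetURL` is the URL you receive (for instance via UIImagePickerControllerMediaURL in didFinishPickingMediaWithInfo:); the output file name is arbitrary:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: export a picked or recorded video into NSTemporaryDirectory().
- (void)exportVideoToTemporaryDirectory:(NSURL *)assetURL {
    AVAsset *asset = [AVAsset assetWithURL:assetURL];
    NSString *outPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"stream-copy.mp4"];
    // The export fails if a file already exists at the destination.
    [[NSFileManager defaultManager] removeItemAtPath:outPath error:nil];

    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetMediumQuality];
    export.outputURL = [NSURL fileURLWithPath:outPath];
    export.outputFileType = AVFileTypeMPEG4;

    [export exportAsynchronouslyWithCompletionHandler:^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Saved local copy to %@", outPath);
        } else {
            NSLog(@"Export failed: %@", export.error);
        }
    }];
}
```

Files in NSTemporaryDirectory() can also be purged by the system, so move the file somewhere permanent (e.g. the Documents directory) if you need to keep it.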
Once you have got the AVCaptureSession, you can add an instance of AVCaptureMovieFileOutput to it. Source: https://www.objc.io/issues/23-video/capturing-video/
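Following the pattern from that objc.io article, a sketch of attaching a movie file output to the session the library exposes. Whether the library's session tolerates an extra output alongside its own encoder is an assumption you will need to verify:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: record to a local file from the session returned by the library.
AVCaptureSession *session = [upstream getCaptureSession];

AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];

    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"live-recording.mov"];
    [[NSFileManager defaultManager] removeItemAtPath:path error:nil];

    // `self` must conform to AVCaptureFileOutputRecordingDelegate.
    [movieOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:path]
                             recordingDelegate:self];
}

// When you disconnect the stream:
// [movieOutput stopRecording];
```

This way the same capture session feeds both the RTMP publisher and your local recording, so you avoid re-encoding the video twice.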
Also take a look at the AVCaptureFileOutput class reference in order to better understand how to use it: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureFileOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureFileOutput