
ios - How do I store the video on the iPhone while publishing it with RTMPStreamPublisher?

Reposted · Author: 技术小花猫 · Updated: 2023-10-29 10:10:53

I'm currently using RTMPStreamPublisher to publish video to a Wowza server. The upload works fine, but can anyone tell me how to also store the same video on the iPhone while it is being uploaded to the server?

I'm using https://github.com/slavavdovichenko/MediaLibDemos, but there isn't much documentation available. If I could just store the data that is being sent for publishing, that would solve my problem.

Here is the method they use to publish the stream, but I can't find a way to store the same video on my iPhone:

// ACTIONS

- (void)doConnect {
#if 0 // use ffmpeg rtmp
    NSString *url = [NSString stringWithFormat:@"%@/%@", hostTextField.text, streamTextField.text];
    upstream = [[BroadcastStreamClient alloc] init:url resolution:RESOLUTION_LOW];
    upstream.delegate = self;
    upstream.encoder = [MPMediaEncoder new];
    [upstream start];

    socket = [[RTMPClient alloc] init:host];

    btnConnect.title = @"Disconnect";
    return;
#endif

#if 0 // use inside RTMPClient instance
    upstream = [[BroadcastStreamClient alloc] init:hostTextField.text resolution:RESOLUTION_LOW];
    //upstream = [[BroadcastStreamClient alloc] initOnlyAudio:hostTextField.text];
    //upstream = [[BroadcastStreamClient alloc] initOnlyVideo:hostTextField.text resolution:RESOLUTION_LOW];

#else // use outside RTMPClient instance

    if (!socket) {
        socket = [[RTMPClient alloc] init:hostTextField.text];
        if (!socket) {
            [self showAlert:@"Socket has not been created"];
            return;
        }
        [socket spawnSocketThread];
    }

    upstream = [[BroadcastStreamClient alloc] initWithClient:socket resolution:RESOLUTION_LOW];
#endif

    [upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    //[upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    //[upstream setVideoBitrate:512000];
    upstream.delegate = self;

    [upstream stream:streamTextField.text publishType:PUBLISH_LIVE];
    //[upstream stream:streamTextField.text publishType:PUBLISH_RECORD];
    //[upstream stream:streamTextField.text publishType:PUBLISH_APPEND];

    btnConnect.title = @"Disconnect";
}

I did find that, using the BroadcastStreamClient instance named "upstream", I can get hold of the AVCaptureSession with the following line:

[upstream getCaptureSession];

How can I use this AVCaptureSession to record the video on the iPhone?

Best Answer

Once you have the AVCaptureSession, you can add an instance of AVCaptureMovieFileOutput to it like this:

AVCaptureMovieFileOutput *movieFileOutput = [AVCaptureMovieFileOutput new];
if ([captureSession canAddOutput:movieFileOutput]) {
    [captureSession addOutput:movieFileOutput];
}

// Start recording
NSURL *outputURL = …
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
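The answer leaves outputURL unspecified, and the recordingDelegate (self) must conform to AVCaptureFileOutputRecordingDelegate. A minimal sketch of both pieces, assuming a hypothetical file name "capture.mov" in the app's Documents directory (neither is part of the original answer):

// Sketch: build a writable file URL. The "capture.mov" name and the
// Documents-directory location are assumptions for illustration.
NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *filePath = [documentsPath stringByAppendingPathComponent:@"capture.mov"];

// AVCaptureMovieFileOutput will not overwrite an existing file,
// so remove any previous recording first.
[[NSFileManager defaultManager] removeItemAtPath:filePath error:nil];
NSURL *outputURL = [NSURL fileURLWithPath:filePath];

// The delegate (self) must adopt AVCaptureFileOutputRecordingDelegate
// and implement the required completion callback:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    if (error) {
        NSLog(@"Recording failed: %@", error);
    } else {
        NSLog(@"Movie saved to %@", outputFileURL);
    }
}

This callback fires when recording stops (or fails), so it is a natural place to move the file or save it to the photo library.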

Source: https://www.objc.io/issues/23-video/capturing-video/

Also take a look at this for a better understanding of how to use AVCaptureFileOutput: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureFileOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureFileOutput
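When the stream ends you also need to stop the file output, otherwise the movie file is never finalized. A sketch, assuming a hypothetical doDisconnect method mirroring the doConnect above (the teardown call for the RTMP stream depends on the library's own API and is left out):

// Sketch: stop the local recording when streaming stops.
// doDisconnect is an assumed counterpart to doConnect, for illustration.
- (void)doDisconnect {
    if (movieFileOutput.isRecording) {
        // Triggers captureOutput:didFinishRecordingToOutputFileURL:…
        [movieFileOutput stopRecording];
    }
    // …then tear down the RTMP stream using the library's own API.
    btnConnect.title = @"Connect";
}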

Regarding "ios - How do I store the video on the iPhone while publishing it with RTMPStreamPublisher?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/20142647/
