
ios - How can I send video from an iOS device to a server?

Reposted · Author: 可可西里 · Updated: 2023-11-01 05:00:19

I have to send video from an iPhone to a server in real time. I create a capture session and use AVCaptureMovieFileOutput.

NSError *error = nil;

captureSession = [[AVCaptureSession alloc] init];
// Find and attach devices
AVCaptureDevice *muxedDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed];
if (muxedDevice) {
    NSLog(@"got muxedDevice");
    AVCaptureDeviceInput *muxedInput = [AVCaptureDeviceInput deviceInputWithDevice:muxedDevice
                                                                             error:&error];
    if (muxedInput) {
        [captureSession addInput:muxedInput];
    }
} else {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        NSLog(@"got videoDevice");
        AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
                                                                                 error:&error];
        if (videoInput) {
            [captureSession addInput:videoInput];
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (audioDevice) {
        NSLog(@"got audioDevice");
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice
                                                                                 error:&error];
        if (audioInput) {
            [captureSession addInput:audioInput];
        }
    }
}

// Create a preview layer from the session and add it to the UI
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.layer.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
previewLayer.orientation = AVCaptureVideoOrientationPortrait;
[view.layer addSublayer:previewLayer];

// Create the capture file output
captureMovieOutput = [[AVCaptureMovieFileOutput alloc] init];
if (!captureMovieURL) {
    captureMoviePath = [[self getMoviePathWithName:MOVIE_FILE_NAME] retain];
    captureMovieURL = [[NSURL alloc] initFileURLWithPath:captureMoviePath];
}
NSLog(@"recording to %@", captureMovieURL);
[captureSession addOutput:captureMovieOutput];
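
The snippet above only configures the session. For completeness, a minimal sketch of how recording would typically be started, assuming the same `captureSession`, `captureMovieOutput`, and `captureMovieURL` instance variables and that `self` adopts `AVCaptureFileOutputRecordingDelegate` (this part is not in the original question):

// Start the session, then begin writing to the movie file.
[captureSession startRunning];
[captureMovieOutput startRecordingToOutputFileURL:captureMovieURL
                                recordingDelegate:self];

// Delegate callback fired when the file finishes (or fails).
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    if (error) {
        NSLog(@"recording failed: %@", [error localizedDescription]);
    }
}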

I use AVAssetExportSession to cut out segments about 10 seconds long.

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:captureMovieURL
                                        options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                                            forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];

AVMutableComposition *composition = [AVMutableComposition composition];

CMTime endTime;
// 6000 / 600 = 10 seconds at a 600 timescale
CMTime duration = CMTimeMake(6000, 600);
if (asset.duration.value - startFragment.value < 6000) {
    endTime = asset.duration;
} else {
    endTime = CMTimeMake(startFragment.value + 6000, 600);
}
CMTimeRange editRange = CMTimeRangeMake(startFragment, duration);
startFragment = CMTimeMake(endTime.value, 600);
NSError *editError = nil;
// ... and add it into the composition
[composition insertTimeRange:editRange ofAsset:asset atTime:composition.duration error:&editError];

AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition
                                                                        presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
NSString *name = [NSString stringWithFormat:MOVUE_SEGMENT_NAME, countMovies];
NSString *path = [NSString stringWithFormat:@"file://localhost%@", [self getMoviePathWithName:name]];
NSURL *url = [NSURL URLWithString:path];
NSLog(@"urlsegment = %@", url);
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = url;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (AVAssetExportSessionStatusCompleted == exportSession.status) {
        countMovies++;
        NSLog(@"AVAssetExportSessionStatusCompleted");
    } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
        NSLog(@"AVAssetExportSessionStatusFailed: %@", [exportSession.error localizedDescription]);
    } else {
        NSLog(@"Export Session Status: %ld", (long)exportSession.status);
    }
}];
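
Once a segment has exported successfully it still has to be pushed to the server. The question does not show that part; a minimal sketch using era-appropriate Foundation networking, where the endpoint URL is hypothetical and `url` is the exported segment's file URL from the code above:

// Hypothetical endpoint -- replace with the real upload URL.
NSURL *uploadURL = [NSURL URLWithString:@"http://example.com/upload"];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
[request setHTTPMethod:@"POST"];
[request setValue:@"video/mp4" forHTTPHeaderField:@"Content-Type"];
// Load the exported segment and send it as the request body.
NSData *segmentData = [NSData dataWithContentsOfURL:url];
[request setHTTPBody:segmentData];

[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response,
                                           NSData *data,
                                           NSError *connectionError) {
    if (connectionError) {
        NSLog(@"upload failed: %@", [connectionError localizedDescription]);
    }
}];

Loading the whole segment into memory is fine for 10-second clips; a streamed upload would be preferable for anything larger.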


If the export session status is completed, I send the video to the server. But it is slow: producing a 10-second movie and sending it to the server takes about 15 seconds, and making the clips shorter than 10 seconds changes nothing. How can I fix this? What is the best way to do it? What is a better approach for streaming video to a server?

Best Answer

Use ffmpeg for the encoding; it may perform better than AVAssetExportSession. However, ffmpeg is much harder to work with than AVAssetExportSession.

Regarding "ios - How can I send video from an iOS device to a server?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/9391734/
