
ios - Invalid image metadata when trying to display a Live Photo using PHLivePhotoView (Objective-C)

Reposted · Author: 塔克拉玛干 · Updated: 2023-11-02 10:21:57

I am trying to load a jpg image and a mov file on an iOS device to display a Live Photo, using Objective-C. I wrote the following snippet to do this in the viewDidLoad method:

- (void)viewDidLoad {
    [super viewDidLoad];

    PHLivePhotoView *photoView = [[PHLivePhotoView alloc] initWithFrame:self.view.bounds];

    NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];

    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl]
                                     placeholderImage:[UIImage imageNamed:@"livePhoto.jpg"]
                                           targetSize:self.view.bounds.size
                                          contentMode:PHImageContentModeAspectFit
                                        resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
        NSLog(@"we are in handler");
        photoView.livePhoto = livePhoto;
        photoView.contentMode = UIViewContentModeScaleAspectFit;
        photoView.tag = 87;
        [self.view addSubview:photoView];
        [self.view sendSubviewToBack:photoView];
    }];
}

I have dragged the files livePhoto.jpg and livePhoto.mov into the Xcode project.

But when I build and run, Xcode logs these errors:

2017-11-28 17:46:08.568455+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.580439+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.597147+0800 Live Photos[3669:1276806] Error: Invalid image metadata
2017-11-28 17:46:08.607881+0800 Live Photos[3669:1276806] Error: Invalid video metadata
2017-11-28 17:46:08.608329+0800 Live Photos[3669:1276778] we are in handler

Any ideas? Thanks.

One more thing to ask:

Why is the resultHandler called twice, judging by the printed output?

Best Answer

TL;DR

Here is the code I use to store a Live Photo and upload it to a server:

1. Capture the Live Photo

- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
     resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
      bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
                error:(NSError *)error {
    if (error) {
        [self raiseError:error];
        return;
    }
    NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
                                                                    previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    CIImage *image = [CIImage imageWithData:imageData];
    [self.expectedAsset addInput:image.properties]; // 1. This is the metadata (which will be lost in step 2.)
    [self.expectedAsset addInput:[UIImage imageWithCIImage:image]]; // 2. Creating image, but UIImage is not designed to contain the required metadata
}

- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL
             duration:(CMTime)duration
     photoDisplayTime:(CMTime)photoDisplayTime
     resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
                error:(nullable NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:outputFileURL]; // 3. Store the URL to the actual video file
    }
}

expectedAsset is just an object holding all the required information; you could use an NSDictionary instead. The two delegate methods above belong to the API that was deprecated in iOS 11, so here is the snippet for the newer iOS 11+ delegate method (hence the availability pragma)...

#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunguarded-availability"
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:[photo metadata]];
        [self.expectedAsset addInput:[UIImage imageWithData:[photo fileDataRepresentation]]];
    }
}
#pragma clang diagnostic pop


2. Generate the NSData

- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1); // This is the UIImage (without metadata) from step 2 above
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary new];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary]
              forKey:(NSString *)kCGImagePropertyMakerAppleDictionary]; // imageMetadata is the dictionary from step 1 above
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);
    CFRelease(destination); // Core Foundation objects are not managed by ARC; release to avoid leaks
    CFRelease(source);
    return dest_data;
}

- (void)dataRepresentation:(DataRepresentationLoaded)callback {
    callback(@{@"image": self.imageData,
               @"video": [NSData dataWithContentsOfURL:self.livePhotoURL]}); // livePhotoURL is the URL from step 3 above
}

Long answer

This is caused by wrong metadata in the video/image files. When creating a Live Photo, PHLivePhoto looks up key 17 (the asset identifier) in the image's kCGImagePropertyMakerAppleDictionary and matches it against the com.apple.quicktime.content.identifier of the movie file. The mov file also needs to contain an entry for the moment at which the still image was captured (com.apple.quicktime.still-image-time).
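To make the pairing rule concrete, here is an illustrative sketch in Python (not Apple API; the helper name and the dictionary shapes are assumptions for illustration) of the two checks just described, operating on already-extracted metadata:

```python
# Illustrative sketch of the Live Photo pairing rule described above.
# Not Apple API: in practice these values come from the JPEG's Apple
# MakerNote and the QuickTime metadata of the mov file.

REQUIRED_MOV_KEYS = (
    "com.apple.quicktime.content.identifier",
    "com.apple.quicktime.still-image-time",
)

def is_valid_live_photo_pair(jpg_maker_apple, mov_metadata):
    """True iff the mov carries both required QuickTime keys and the
    JPEG's MakerNote key 17 matches the mov's content identifier."""
    if any(key not in mov_metadata for key in REQUIRED_MOV_KEYS):
        return False
    return jpg_maker_apple.get(17) == mov_metadata["com.apple.quicktime.content.identifier"]

# The asker's files: the identifiers match, but still-image-time is
# absent from the mov, so the pair is rejected.
asker_jpg = {17: "cf70b7de66bd89654967aeef1d557816"}
asker_mov = {"com.apple.quicktime.content.identifier": "cf70b7de66bd89654967aeef1d557816"}
print(is_valid_live_photo_pair(asker_jpg, asker_mov))  # → False
```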

Make sure your files haven't been edited (or exported) somewhere along the way. Even the UIImageJPEGRepresentation function will strip this data from the image.

This is the snippet I use to convert a UIImage to NSData:

- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary new];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary]
              forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);
    CFRelease(destination); // Core Foundation objects are not managed by ARC; release to avoid leaks
    CFRelease(source);
    return dest_data;
}

The handler is called twice: once to tell you the data is corrupt, and once to tell you the request was cancelled (these are two different keys in the info dictionary, e.g. PHLivePhotoInfoErrorKey and PHLivePhotoInfoCancelledKey).

Edit:

This is your mov data:

    $ ffmpeg -i cf70b7de66bd89654967aeef1d557816.mov
      Metadata:
        major_brand     : qt
        minor_version   : 0
        compatible_brands: qt
        creation_time   : 2018-01-27T11:07:38.000000Z
        com.apple.quicktime.content.identifier: cf70b7de66bd89654967aeef1d557816
      Duration: 00:00:15.05, start: 0.000000, bitrate: 1189 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 540x960, 1051 kb/s, 29.84 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
        Metadata:
          creation_time   : 2018-01-27T11:07:38.000000Z
          handler_name    : Core Media Data Handler
          encoder         : 'avc1'
        Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
        Metadata:
          creation_time   : 2018-01-27T11:07:38.000000Z
          handler_name    : Core Media Data Handler

The com.apple.quicktime.still-image-time key is missing here.

The metadata should look like this instead:

      Metadata:
        major_brand     : qt
        minor_version   : 0
        compatible_brands: qt
        creation_time   : 2017-12-15T12:41:00.000000Z
        com.apple.quicktime.content.identifier: 89CB44DA-D129-43F3-A0BC-2C980767B810
        com.apple.quicktime.location.ISO6709: +51.5117+007.4668+086.000/
        com.apple.quicktime.make: Apple
        com.apple.quicktime.model: iPhone X
        com.apple.quicktime.software: 11.1.2
        com.apple.quicktime.creationdate: 2017-12-15T13:41:00+0100
      Duration: 00:00:01.63, start: 0.000000, bitrate: 8902 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/smpte432/bt709), 1440x1080, 8135 kb/s, 26.94 fps, 30 tbr, 600 tbn, 1200 tbc (default)
        Metadata:
          rotate          : 90
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
          encoder         : H.264
        Side data:
          displaymatrix: rotation of -90.00 degrees
        Stream #0:1(und): Audio: pcm_s16le (lpcm / 0x6D63706C), 44100 Hz, mono, s16, 705 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
        Stream #0:2(und): Data: none (mebx / 0x7862656D), 12 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
        Stream #0:3(und): Data: none (mebx / 0x7862656D), 43 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler

And just FYI, this is your JPEG data:

    $ magick identify -format %[EXIF:*] cf70b7de66bd89654967aeef1d557816.jpg
    exif:ColorSpace=1
    exif:ExifImageLength=960
    exif:ExifImageWidth=540
    exif:ExifOffset=26
    exif:MakerNote=65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1, 0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0, 99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53, 52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54, 0, 0
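As a side note, the asset identifier can be recovered by hand from that MakerNote byte dump: after the "Apple iOS\0" header comes a small big-endian TIFF-style IFD, and tag 17 holds the identifier string. The following stand-alone Python sketch parses just this layout (illustration only, not a general MakerNote parser):

```python
# Decode the Apple MakerNote bytes shown above and extract tag 17
# (the Live Photo asset identifier).
import struct

MAKER_NOTE = bytes([
    65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1,
    0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0,
    99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53,
    52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54,
    0, 0,
])

def apple_maker_note_tag(data, wanted_tag):
    assert data[:10] == b"Apple iOS\x00"   # fixed 10-byte header
    assert data[12:14] == b"MM"            # big-endian byte-order mark
    (count,) = struct.unpack_from(">H", data, 14)
    for i in range(count):                 # 12-byte IFD entries start at offset 16
        tag, typ, n, offset = struct.unpack_from(">HHII", data, 16 + 12 * i)
        if tag == wanted_tag and typ == 2: # type 2 = NUL-terminated ASCII
            return data[offset:offset + n].rstrip(b"\x00")
    return None

print(apple_maker_note_tag(MAKER_NOTE, 17).decode())  # → cf70b7de66bd89654967aeef1d557816
```

The result is exactly the com.apple.quicktime.content.identifier seen in the mov dump, which is why the identifiers match and only the missing still-image-time key breaks the pair.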

For "ios - Invalid image metadata when trying to display a Live Photo using PHLivePhotoView (Objective-C)", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/47528440/
