ios - Invalid image metadata when trying to display a Live Photo with PHLivePhotoView (Objective-C)

Tags: ios objective-c xcode phlivephoto

I am trying to display a Live Photo on an iOS device by loading a jpg image and a mov file with Objective-C. I wrote the following snippet, which runs in viewDidLoad:

- (void)viewDidLoad {
    [super viewDidLoad];

    PHLivePhotoView *photoView = [[PHLivePhotoView alloc] initWithFrame:self.view.bounds];

    NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];

    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl] placeholderImage:[UIImage imageNamed:@"livePhoto.jpg"] targetSize:self.view.bounds.size contentMode:PHImageContentModeAspectFit resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info){
        NSLog(@"we are in handler");
        photoView.livePhoto = livePhoto;
        photoView.contentMode = UIViewContentModeScaleAspectFit;
        photoView.tag = 87;
        [self.view addSubview:photoView];
        [self.view sendSubviewToBack:photoView];
    }];
}

I dragged the files livePhoto.jpg and livePhoto.mov into the Xcode project.

But when I build and run, Xcode logs these errors:

2017-11-28 17:46:08.568455+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.580439+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.597147+0800 Live Photos[3669:1276806] Error: Invalid image metadata
2017-11-28 17:46:08.607881+0800 Live Photos[3669:1276806] Error: Invalid video metadata
2017-11-28 17:46:08.608329+0800 Live Photos[3669:1276778] we are in handler

Any ideas? Thanks.

One more question:

Why is the resultHandler called twice, judging by what gets printed?

Accepted answer

TL;DR

Here is code that stores a Live Photo and uploads it to a server:
1. Capture the Live Photo

- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error {
    if (error) {
        [self raiseError:error];
        return;
    }
    NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    CIImage *image = [CIImage imageWithData:imageData];
    [self.expectedAsset addInput:image.properties]; // 1. This is the metadata (which will be lost in step 2.)
    [self.expectedAsset addInput:[UIImage imageWithCIImage:image]]; // 2. Creating image, but UIImage is not designed to contain the required metadata
}
- (void)captureOutput:(AVCapturePhotoOutput *)output 
didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL duration:(CMTime)duration photoDisplayTime:(CMTime)photoDisplayTime resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings error:(nullable NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:outputFileURL]; // 3. Store the URL to the actual video file
    }
}

expectedAsset is just an object that holds all the required information; you could use an NSDictionary instead. The delegate method above is the pre-iOS 11 API (deprecated in iOS 11), so here is the variant for the iOS 11+ API, with the availability warning silenced for older deployment targets...
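Since the answer never shows expectedAsset itself, here is a minimal stand-in sketching what it might collect; the ExpectedAsset class, its properties, and the type-based dispatch in addInput: are all assumptions, not the answerer's actual code:

```objc
#import <UIKit/UIKit.h>

// Hypothetical stand-in for expectedAsset: collects the three inputs
// the upload needs, dispatching on the type of what arrives.
@interface ExpectedAsset : NSObject
@property (nonatomic, strong) NSDictionary *imageMetadata; // step 1: image.properties
@property (nonatomic, strong) UIImage *image;              // step 2: the (metadata-less) UIImage
@property (nonatomic, strong) NSURL *livePhotoURL;         // step 3: URL of the paired mov
- (void)addInput:(id)input;
@end

@implementation ExpectedAsset
- (void)addInput:(id)input {
    if ([input isKindOfClass:[NSDictionary class]]) {
        self.imageMetadata = input;
    } else if ([input isKindOfClass:[UIImage class]]) {
        self.image = input;
    } else if ([input isKindOfClass:[NSURL class]]) {
        self.livePhotoURL = input;
    }
}
@end
```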

#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunguarded-availability"
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:[photo metadata]];
        [self.expectedAsset addInput:[UIImage imageWithData:[photo fileDataRepresentation]]];
    }
}
#pragma clang diagnostic pop 


2. Generate the NSData

- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1); // This is the UIImage (without metadata) from step 2 above
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary new];
    // imageMetadata is the dictionary from step 1 above
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary] forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);
    CFRelease(destination); // Core Foundation objects are not managed by ARC
    CFRelease(source);
    return dest_data;
}

- (void)dataRepresentation:(DataRepresentationLoaded)callback {
    callback(@{@"image": self.imageData, @"video": [NSData dataWithContentsOfURL:self.livePhotoURL]}); // LivePhotoURL is the url from step 3 above
}

Long answer

This is caused by wrong metadata in the video/image files. When PHLivePhoto creates the Live Photo, it looks up key 17 in the image's kCGImagePropertyMakerAppleDictionary (this is the asset identifier) and matches it against the com.apple.quicktime.content.identifier of the mov file. The mov file additionally needs an entry that marks the time where the still image was captured (com.apple.quicktime.still-image-time).
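You can check the pairing in code before handing the files to PHLivePhoto. A diagnostic sketch (the livePhoto.jpg/livePhoto.mov bundle resources are taken from the question; the maker-note subkey is read as the string "17"):

```objc
#import <ImageIO/ImageIO.h>
#import <AVFoundation/AVFoundation.h>

// Read key 17 (the asset identifier) from the JPEG's Apple maker note.
NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)imageUrl, NULL);
NSDictionary *props = CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(source, 0, NULL));
NSString *imageIdentifier = props[(NSString *)kCGImagePropertyMakerAppleDictionary][@"17"];
CFRelease(source);

// Read com.apple.quicktime.content.identifier from the mov's QuickTime metadata.
NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];
AVAsset *asset = [AVAsset assetWithURL:videoUrl];
NSString *videoIdentifier = nil;
for (AVMetadataItem *item in asset.metadata) {
    if ([item.identifier isEqualToString:AVMetadataIdentifierQuickTimeMetadataContentIdentifier]) {
        videoIdentifier = (NSString *)item.value;
    }
}

// The two identifiers must be equal, or PHLivePhoto rejects the pair.
NSLog(@"image identifier: %@ / video identifier: %@", imageIdentifier, videoIdentifier);
```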

Make sure your files have not been edited (or exported) somewhere along the way. Even the UIImageJPEGRepresentation function strips this data from the image.

Here is the snippet I use to convert a UIImage to NSData:

- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary new];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary] forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);
    CFRelease(destination); // release the Core Foundation objects; ARC does not manage them
    CFRelease(source);
    return dest_data;
}

The handler is called twice: the first time to tell you the data is corrupt, and the second time to tell you the request was cancelled (these are two different keys in the info dictionary).
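The Photos framework documents these states as PHLivePhotoInfoErrorKey, PHLivePhotoInfoCancelledKey, and PHLivePhotoInfoIsDegradedKey in the handler's info dictionary. A sketch of a handler body that tells the calls apart (photoView is the view from the question):

```objc
resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
    if ([info[PHLivePhotoInfoCancelledKey] boolValue]) {
        NSLog(@"request was cancelled");
        return;
    }
    if (info[PHLivePhotoInfoErrorKey]) {
        NSLog(@"request failed: %@", info[PHLivePhotoInfoErrorKey]);
        return;
    }
    if ([info[PHLivePhotoInfoIsDegradedKey] boolValue]) {
        // A low-quality version was delivered first; a better one follows,
        // which is why the handler fires more than once on success too.
        NSLog(@"degraded result delivered");
    }
    photoView.livePhoto = livePhoto;
}
```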

Edit:

This is your mov file's metadata:


    $ ffmpeg -i cf70b7de66bd89654967aeef1d557816.mov
    Metadata:
        major_brand     : qt  
        minor_version   : 0
        compatible_brands: qt  
        creation_time   : 2018-01-27T11:07:38.000000Z
        com.apple.quicktime.content.identifier: cf70b7de66bd89654967aeef1d557816
      Duration: 00:00:15.05, start: 0.000000, bitrate: 1189 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 540x960, 1051 kb/s, 29.84 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
        Metadata:
          creation_time   : 2018-01-27T11:07:38.000000Z
          handler_name    : Core Media Data Handler
          encoder         : 'avc1'
        Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
        Metadata:
          creation_time   : 2018-01-27T11:07:38.000000Z
          handler_name    : Core Media Data Handler

The com.apple.quicktime.still-image-time key is missing here.
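Note that com.apple.quicktime.still-image-time is not a top-level atom like content.identifier; it lives in a timed metadata track (one of the mebx data streams visible in the dump below), so simple remuxing tools will not add it. If you assemble the mov yourself, one common approach is an AVAssetWriterInputMetadataAdaptor; the sketch below assumes you already have an AVAssetWriter writing the video track and know the still frame's presentation time (e.g. the photoDisplayTime from the capture callback):

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: attach a timed metadata track carrying the still-image-time
// marker to an existing AVAssetWriter. Call before [writer startWriting];
// append the group after the session has started.
static void AddStillImageTimeTrack(AVAssetWriter *writer, CMTime stillImageTime) {
    NSDictionary *spec = @{
        (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier:
            @"mdta/com.apple.quicktime.still-image-time",
        (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType:
            @"com.apple.metadata.datatype.int8"
    };
    CMFormatDescriptionRef desc = NULL;
    CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
        kCFAllocatorDefault, kCMMetadataFormatType_Boxed,
        (__bridge CFArrayRef)@[spec], &desc);

    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeMetadata
                                           outputSettings:nil
                                         sourceFormatHint:desc];
    AVAssetWriterInputMetadataAdaptor *adaptor =
        [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:input];
    [writer addInput:input];

    // One item whose time range marks the still frame.
    AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
    item.identifier = @"mdta/com.apple.quicktime.still-image-time";
    item.dataType = @"com.apple.metadata.datatype.int8";
    item.value = @0;
    AVTimedMetadataGroup *group =
        [[AVTimedMetadataGroup alloc] initWithItems:@[item]
                                          timeRange:CMTimeRangeMake(stillImageTime, CMTimeMake(1, 600))];
    [adaptor appendTimedMetadataGroup:group];
}
```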

The metadata should look like this instead:


    Metadata:
        major_brand     : qt  
        minor_version   : 0
        compatible_brands: qt  
        creation_time   : 2017-12-15T12:41:00.000000Z
        com.apple.quicktime.content.identifier: 89CB44DA-D129-43F3-A0BC-2C980767B810
        com.apple.quicktime.location.ISO6709: +51.5117+007.4668+086.000/
        com.apple.quicktime.make: Apple
        com.apple.quicktime.model: iPhone X
        com.apple.quicktime.software: 11.1.2
        com.apple.quicktime.creationdate: 2017-12-15T13:41:00+0100
      Duration: 00:00:01.63, start: 0.000000, bitrate: 8902 kb/s
        Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/smpte432/bt709), 1440x1080, 8135 kb/s, 26.94 fps, 30 tbr, 600 tbn, 1200 tbc (default)
        Metadata:
          rotate          : 90
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
          encoder         : H.264
        Side data:
          displaymatrix: rotation of -90.00 degrees
        Stream #0:1(und): Audio: pcm_s16le (lpcm / 0x6D63706C), 44100 Hz, mono, s16, 705 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
        Stream #0:2(und): Data: none (mebx / 0x7862656D), 12 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler
        Stream #0:3(und): Data: none (mebx / 0x7862656D), 43 kb/s (default)
        Metadata:
          creation_time   : 2017-12-15T12:41:00.000000Z
          handler_name    : Core Media Data Handler

And just FYI, this is your JPEG's data:


    $ magick identify -format %[EXIF:*] cf70b7de66bd89654967aeef1d557816.jpg
    exif:ColorSpace=1
    exif:ExifImageLength=960
    exif:ExifImageWidth=540
    exif:ExifOffset=26
    exif:MakerNote=65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1, 0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0, 99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53, 52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54, 0, 0

The original question is on Stack Overflow: https://stackoverflow.com/questions/47528440/
