I'm currently building a video-collage app. I have a view that contains an AVPlayerLayer as a sublayer, and I need to take a screenshot of that view. When I try, I do get a screenshot, but the AVPlayerLayer (with the video playing inside it) is missing from it; it's just a black area. On the simulator it works perfectly and the layer shows up, but on a real device it's just black.
I've tried every solution I could find on Stack Overflow and in Apple's developer documentation, but nothing worked.
Some of the solutions I tried:
swift: How to take screenshot of AVPlayerLayer()
Screenshot for AVPlayer and Video
https://developer.apple.com/documentation/avfoundation/avcapturevideopreviewlayer
As you can see in my code, it should work for capturing an image from a view, but it doesn't work for the AVPlayerLayer.
- (UIImage *)imageFromView:(UIView *)view
{
    // ...WithOptions with scale 0 renders at the screen scale (Retina).
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
    // Draw the view's own bounds (the original drew an unrelated _videoFrame rect);
    // afterScreenUpdates:YES forces a fresh render pass before capturing.
    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Note: the original declared `Boolean *isOutputJPS`, a pointer, so the
    // truth test was wrong. A plain BOOL is what was intended.
    BOOL isOutputJPG = NO;
    NSData *data;
    if (isOutputJPG) {
        data = UIImageJPEGRepresentation(image, 0.5);
    } else {
        data = UIImagePNGRepresentation(image);
    }
    return [UIImage imageWithData:data];
}
// In the view controller:
UIImage *image = [self imageFromView:recordingView];
I'm getting a bit desperate now, because nothing seems to work for AVPlayerLayer.
When I inspect the image generated on a real device, it only shows me the view; on the simulator it works as I expect.
Best Answer
There are many ways to achieve what you want to do. I have found that using the asset image generator always works.
- (NSImage *)getImageFromAsset:(AVAsset *)myAsset width:(int)theWidth height:(int)theHeight {
    Float64 durationSeconds = CMTimeGetSeconds(myAsset.duration);

    /// Change the frametimetoget section to your specific needs ///
    CMTime frametimetoget;
    if (durationSeconds <= 20) {
        frametimetoget = CMTimeMakeWithSeconds(durationSeconds / 2, 600);
    } else {
        frametimetoget = CMTimeMakeWithSeconds(10, 600);
    }

    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
    imageGenerator.appliesPreferredTrackTransform = YES;
    imageGenerator.maximumSize = CGSizeMake(theWidth, theHeight);
    // Use the framework constant rather than spelling it out as a string literal.
    imageGenerator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    /// NSError not handled in this example; you would have to add code ///
    NSError *error = nil;
    CMTime actualTime;
    CGImageRef frameImage = [imageGenerator copyCGImageAtTime:frametimetoget actualTime:&actualTime error:&error];

    Float64 myImageWidth = CGImageGetWidth(frameImage);
    Float64 myImageHeight = CGImageGetHeight(frameImage);
    Float64 ratio = myImageWidth / theWidth;
    NSSize imageSize;
    imageSize.width = myImageWidth / ratio;
    imageSize.height = myImageHeight / ratio;

    /// You may choose to use CGImage and skip below
    /// Swap out NSImage (macOS) for the iOS equivalent
    NSImage *thumbNail = [[NSImage alloc] initWithCGImage:frameImage size:imageSize];
    /// CGImageRelease is a must to avoid memory leaks
    CGImageRelease(frameImage);
    return thumbNail;
}
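For iOS, the NSImage/NSSize parts swap out for their UIKit counterparts. A sketch of that adaptation (the method name is mine; the logic is otherwise the same, and since `maximumSize` already constrains the output, the manual ratio math from the macOS version can be dropped):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

- (UIImage *)thumbnailFromAsset:(AVAsset *)asset width:(int)theWidth height:(int)theHeight {
    Float64 durationSeconds = CMTimeGetSeconds(asset.duration);
    CMTime frameTime = (durationSeconds <= 20)
        ? CMTimeMakeWithSeconds(durationSeconds / 2, 600)
        : CMTimeMakeWithSeconds(10, 600);

    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    generator.maximumSize = CGSizeMake(theWidth, theHeight);
    generator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    NSError *error = nil;
    CMTime actualTime;
    CGImageRef frameImage = [generator copyCGImageAtTime:frameTime actualTime:&actualTime error:&error];
    if (frameImage == NULL) {
        NSLog(@"Thumbnail generation failed: %@", error);
        return nil;
    }
    UIImage *thumbnail = [UIImage imageWithCGImage:frameImage];
    CGImageRelease(frameImage); // must release the CGImage to avoid a leak
    return thumbnail;
}
```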
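Separately, if you need the frame the player is showing *right now* (rather than a frame pulled from the asset at a chosen time), AVPlayerItemVideoOutput is a common workaround for the black-layer problem on device, because it hands back the pixel buffer the player is actually rendering. A sketch, assuming an already-playing `AVPlayer` named `player`:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Attach the output once, when the player item is set up.
AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc]
    initWithPixelBufferAttributes:@{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)}];
[player.currentItem addOutput:output];

// Later, at the moment you want the screenshot:
CMTime time = player.currentTime;
if ([output hasNewPixelBufferForItemTime:time]) {
    CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CVPixelBufferRelease(buffer);
    // Composite `frame` with the rest of your view snapshot as needed.
}
```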
Regarding "ios - Is there a way to capture a screenshot of a view containing an AVPlayerLayer?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58270930/