ios - Screen capture including an AVCaptureVideoPreviewLayer with overlay buttons

Tags: ios iphone objective-c avcapturesession

I am using a screen recorder to record the screen. It works fine when the iPhone screen is filled with ordinary views. But when an AVCaptureVideoPreviewLayer is shown with overlay buttons on top, the saved screen-capture video shows the overlay buttons without the AVCaptureVideoPreviewLayer underneath. I followed this tutorial to add the overlay. How can I fix this?

Best answer

AVCaptureVideoPreviewLayer is rendered by the GPU and is not captured by renderInContext:. Instead, receive the camera frames yourself, draw them into a UIImageView placed behind the overlay buttons, and then screenshot the container view, which will contain both the frame and the buttons.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        if ([connection isVideoOrientationSupported])
            [connection setVideoOrientation:[self cameraOrientation]];

        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        /* Lock the pixel buffer before touching its base address */
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        /* Get information about the image */
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        /* Create a CGImageRef from the CVImageBufferRef (BGRA layout) */
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);

        /* Unlock the buffer and release the drawing objects */
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        CGContextRelease(newContext);
        CGColorSpaceRelease(colorSpace);

        /* UIImage retains the CGImage, so releasing it right after is safe.
           Use imageWithCGImage:scale:orientation: instead if the frame
           needs rotating. */
        image1 = [UIImage imageWithCGImage:newImage];
        CGImageRelease(newImage);

        /* Push the camera frame into the image view that sits under the
           overlay buttons; the view-hierarchy screenshot below then
           includes it */
        dispatch_sync(dispatch_get_main_queue(), ^{
            [self.imageView setImage:image1];
        });
    }
}

Run writeSample with an NSTimer:

-(void)writeSample:(NSTimer *)_timer {

    if (assetWriterInput.readyForMoreMediaData) {
        @autoreleasepool {
            CVReturn cvErr = kCVReturnSuccess;

            /* Screenshot the container view: rendering its layer captures
               the image view showing the camera frame plus the overlay
               buttons above it */
            UIGraphicsBeginImageContext(baseViewOne.frame.size);
            [[baseViewOne layer] renderInContext:UIGraphicsGetCurrentContext()];
            screenshota = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            CGImageRef image = (CGImageRef)[screenshota CGImage];

            /* Wrap the screenshot's bytes in a pixel buffer */
            CVPixelBufferRef pixelBuffer = NULL;
            CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
            cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                 baseViewOne.frame.size.width,
                                                 baseViewOne.frame.size.height,
                                                 kCVPixelFormatType_32BGRA,
                                                 (void *)CFDataGetBytePtr(imageData),
                                                 CGImageGetBytesPerRow(image),
                                                 NULL,
                                                 NULL,
                                                 NULL,
                                                 &pixelBuffer);
            if (cvErr != kCVReturnSuccess) {
                CFRelease(imageData);
                return;
            }

            /* Stamp the frame with elapsed wall-clock time, minus any
               time spent paused */
            CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
            elapsedTime = thisFrameWallClockTime - (firstFrameWallClockTime + pausedFrameTime);
            CMTime presentationTime = CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE);

            BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];

            /* Release in both branches so a failed append does not leak */
            CVPixelBufferRelease(pixelBuffer);
            CFRelease(imageData);

            if (!appended) {
                [self stopRecording];
            }
        }
    }
}

Regarding "ios - Screen capture including an AVCaptureVideoPreviewLayer with overlay buttons", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/19785745/
