iOS Screenshot / App Recording Tips

Tags: ios, screenshot, capture

I have a visually fairly complex app, with a base UIViewController and several UIViews (subclassed and extended by me). Periodically I throw up UIAlertViews and UIPopoverControllers.

I'm working on a video-recording solution so that, as the user works with the app, it records what's happening for later debriefing.

I have a partially working solution, but it's very slow (it can't manage more than about one frame per second), it has some kinks (the images are currently rotated and skewed, though I think I can fix that), and it's not what I'd consider an ideal solution.

I moved on from that idea and started implementing a solution that uses UIGraphicsGetImageFromCurrentImageContext(), but it keeps giving me nil images, even when called from within drawRect:.

It occurred to me, though, that I don't want to keep calling drawRect: just to get a screenshot! I don't want to actually kick off any extra drawing, just capture what's already on screen.

I'm happy to post the code I'm using, but it isn't really working yet. Does anyone know of a good solution for what I'm after?

The one solution I've found doesn't fully work for me, because it never seems to capture UIAlertViews and other overlaid views.
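For reference, that approach is along these lines (a simplified sketch rather than my exact code; view stands in for whatever view I'm snapshotting):

// Render a view's layer into a UIKit bitmap image context, then read the image back.
// Note: UIGraphicsGetImageFromCurrentImageContext() returns nil whenever the current
// context is not an image context -- which is why calling it bare (e.g. from within
// drawRect:) produces nil images.
UIGraphicsBeginImageContextWithOptions(view.bounds.size, YES, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();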

Any help?

Thanks!

Bump

Best Answer

I wasn't able to get full-size live video encoding working. However, as an alternative, consider this.

Instead of recording frames, record the actions taken (with timestamps) as they occur. Then, when you want to play it back, just replay the actions. You already have the code, because you execute it in "real life".

All you do is replay those same actions, with the same timing.
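As a rough sketch of that idea (my illustration, with hypothetical names; assume a recorder object that owns an NSMutableArray *recordedActions):

// Hypothetical sketch: one recorded user action.
@interface AppAction : NSObject
@property (nonatomic, assign) CFAbsoluteTime timestamp;   // when the action originally happened
@property (nonatomic, copy) void (^perform)(void);        // the work the action does
@end

@implementation AppAction
@end

// Wrap each user action so it is performed "in real life" and remembered.
- (void)recordAction:(void (^)(void))block
{
    AppAction *action = [[AppAction alloc] init];
    action.timestamp = CFAbsoluteTimeGetCurrent();
    action.perform = block;
    [self.recordedActions addObject:action];
    block();
}

// Replay every action with the same delay it originally had from the first one.
- (void)replayActions
{
    if ([self.recordedActions count] == 0) return;
    CFAbsoluteTime start = [[self.recordedActions objectAtIndex:0] timestamp];
    for (AppAction *action in self.recordedActions) {
        NSTimeInterval delay = action.timestamp - start;
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delay * NSEC_PER_SEC)),
                       dispatch_get_main_queue(), action.perform);
    }
}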

Edit

If you want to try the recording route, here's what I did (note that I abandoned it... it was an experiment in progress, so just take it as an example of how I approached it... none of it is production-ready). I was able to record live audio/video at 640x360, but that resolution was too low for me. It looked fine on the iPad, but terrible once I moved the video over to my Mac to watch it.

I ran into problems at higher resolutions. I adapted most of this code from the RosyWriter sample project. Below are the main routines for setting up the asset writer, starting recording, and adding a UIImage to the video stream.

Good luck.

CGSize const VIDEO_SIZE = { 640, 360 };

- (void) startRecording
{
    dispatch_async(movieWritingQueue, ^{
        NSLog(@"startRecording called in state 0x%04x", state);
        if (state != STATE_IDLE) return;
        state = STATE_STARTING_RECORDING;
        NSLog(@"startRecording changed state to 0x%04x", state);

        NSError *error = nil;
        //assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:AVFileTypeQuickTimeMovie error:&error];
        assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:AVFileTypeMPEG4 error:&error];
        if (!assetWriter) {
            // Bail out rather than carrying on with a nil writer
            [self showError:error];
            return;
        }
        [self removeFile:movieURL];
        [self resumeCaptureSession];
        [self.delegate recordingWillStart];
    }); 
}


// TODO: this is where we write an image into the movie stream...
- (void) writeImage:(UIImage*)inImage
{
    static CFTimeInterval const minInterval = 1.0 / 10.0;

    static CFAbsoluteTime lastFrameWrittenWallClockTime;
    CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    CFTimeInterval timeBetweenFrames = thisFrameWallClockTime - lastFrameWrittenWallClockTime;
    if (timeBetweenFrames < minInterval) return;

    // Not really accurate, but we just want to limit the rate we try to write frames...
    lastFrameWrittenWallClockTime = thisFrameWallClockTime;

    dispatch_async(movieWritingQueue, ^{
        if ( !assetWriter ) return;

        if ((state & STATE_STARTING_RECORDING) && !(state & STATE_MASK_VIDEO_READY)) {
            if ([self setupAssetWriterImageInput:inImage]) {
                [self videoIsReady];
            }
        }
        if (state != STATE_RECORDING) return;
        if (assetWriter.status != AVAssetWriterStatusWriting) return;

        CGImageRef cgImage = CGImageCreateCopy([inImage CGImage]);
        if (assetWriterVideoIn.readyForMoreMediaData) {
            CVPixelBufferRef pixelBuffer = NULL;

            // Resize the original image...
            if (!CGSizeEqualToSize(inImage.size, VIDEO_SIZE)) {
                // Build a context that's the same dimensions as the new size
                CGRect newRect = CGRectIntegral(CGRectMake(0, 0, VIDEO_SIZE.width, VIDEO_SIZE.height));
                CGContextRef bitmap = CGBitmapContextCreate(NULL,
                                                            newRect.size.width,
                                                            newRect.size.height,
                                                            CGImageGetBitsPerComponent(cgImage),
                                                            0,
                                                            CGImageGetColorSpace(cgImage),
                                                            CGImageGetBitmapInfo(cgImage));

                // Rotate and/or flip the image if required by its orientation
                //CGContextConcatCTM(bitmap, transform);

                // Set the quality level to use when rescaling
                CGContextSetInterpolationQuality(bitmap, kCGInterpolationHigh);

                // Draw into the context; this scales the image
                //CGContextDrawImage(bitmap, transpose ? transposedRect : newRect, imageRef);
                CGContextDrawImage(bitmap, newRect, cgImage);

                // Get the resized image from the context and a UIImage
                CGImageRef newImageRef = CGBitmapContextCreateImage(bitmap);
                CGContextRelease(bitmap);
                CGImageRelease(cgImage);
                cgImage = newImageRef;
            }

            CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

            // Grab a buffer from the adaptor's pool; bail out on failure rather than
            // dereferencing a NULL pixel buffer below.
            CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, self.assetWriterPixelBufferAdaptor.pixelBufferPool, &pixelBuffer);
            if (status != kCVReturnSuccess || pixelBuffer == NULL) {
                NSLog(@"Error creating pixel buffer:  status=%d", (int)status);
                CFRelease(image);
                CGImageRelease(cgImage);
                return;
            }

            // Copy the image data into the pixel buffer
            CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
            uint8_t* destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);

            // Danger, Will Robinson!!!!!  USE_BLOCK_IN_FRAME warning...
            // (This assumes the CGImage's bytes-per-row matches the pixel buffer's.)
            CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);

            // Timestamp the frame relative to the wall-clock time of the first frame
            CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
            CMTime presentationTime = CMTimeAdd(firstBufferTimeStamp, CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE));
            BOOL success = [self.assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
            if (!success)
                NSLog(@"Warning:  Unable to write buffer to video");

            //clean up
            CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );
            CVPixelBufferRelease( pixelBuffer );
            CFRelease(image);
            CGImageRelease(cgImage);
        } else {
            NSLog(@"Not ready for video data");
        }
    });
}
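One caveat about the straight CFDataGetBytes copy above (my note, not from the original answer): it assumes the CGImage's bytes-per-row equals the pixel buffer's, and Core Video often pads its rows. A row-by-row copy along these lines is safer:

// Row-by-row copy that tolerates differing row strides between a CGImage and a
// CVPixelBuffer. Call with the pixel buffer's base address already locked.
static void CopyImageIntoPixelBuffer(CGImageRef cgImage, CFDataRef imageData, CVPixelBufferRef pixelBuffer)
{
    uint8_t *destPixels    = CVPixelBufferGetBaseAddress(pixelBuffer);
    const UInt8 *srcPixels = CFDataGetBytePtr(imageData);
    size_t srcBytesPerRow  = CGImageGetBytesPerRow(cgImage);
    size_t destBytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t height          = CGImageGetHeight(cgImage);

    for (size_t row = 0; row < height; row++) {
        memcpy(destPixels + row * destBytesPerRow,
               srcPixels  + row * srcBytesPerRow,
               MIN(srcBytesPerRow, destBytesPerRow));
    }
}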


-(BOOL) setupAssetWriterImageInput:(UIImage*)image
{
    NSDictionary* videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSNumber numberWithDouble:1024.0*1024.0], AVVideoAverageBitRateKey,
                                           nil ];

    NSDictionary* videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   //[NSNumber numberWithInt:image.size.width], AVVideoWidthKey,
                                   //[NSNumber numberWithInt:image.size.height], AVVideoHeightKey,
                                   [NSNumber numberWithInt:VIDEO_SIZE.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:VIDEO_SIZE.height], AVVideoHeightKey,

                                   videoCompressionProps, AVVideoCompressionPropertiesKey,
                                   nil];
    NSLog(@"videoSettings: %@", videoSettings);

    assetWriterVideoIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    NSParameterAssert(assetWriterVideoIn);
    assetWriterVideoIn.expectsMediaDataInRealTime = YES;
    NSDictionary* bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                      [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

    self.assetWriterPixelBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoIn sourcePixelBufferAttributes:bufferAttributes];

    //add input
    if ([assetWriter canAddInput:assetWriterVideoIn]) {
        [assetWriter addInput:assetWriterVideoIn];
    }
    else {
        NSLog(@"Couldn't add asset writer video input.");
        return NO;
    }

    return YES;
}
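As for where the UIImage frames come from: one way to drive writeImage: (my sketch, not part of the original answer; frameTimer is an assumed NSTimer property) is a timer that snapshots every UIWindow. Iterating all windows, rather than just the key window, is what picks up UIAlertViews, since they live in their own window:

// Hypothetical driver: snapshot the screen ~10 times per second and feed writeImage:.
- (void)startFrameTimer
{
    self.frameTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 10.0
                                                       target:self
                                                     selector:@selector(captureFrame)
                                                     userInfo:nil
                                                      repeats:YES];
}

- (void)captureFrame
{
    CGSize screenSize = [UIScreen mainScreen].bounds.size;
    UIGraphicsBeginImageContextWithOptions(screenSize, YES, 1.0);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // Render every window, not just the key window, so that UIAlertViews
    // (which live in their own UIWindow) make it into the frame too.
    for (UIWindow *window in [UIApplication sharedApplication].windows) {
        CGContextSaveGState(context);
        CGContextTranslateCTM(context, window.frame.origin.x, window.frame.origin.y);
        [window.layer renderInContext:context];
        CGContextRestoreGState(context);
    }

    UIImage *frame = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self writeImage:frame];
}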

This Q&A is based on a similar question about iOS screenshot / app recording tips on Stack Overflow: https://stackoverflow.com/questions/9823242/
