ios - How do I record a video of the device screen using the iOS SDK?

Tags: ios ios-simulator

I am working on a tutorial app in which I have to capture a video of the iOS screen. The screen recording itself works fine, but our requirement is to record the screen while a video is playing. I tried playing the video with a UIWebView and with MPMoviePlayer. Recording works before the player starts, but once the player starts I get only a black screen. Any suggestions?

I followed this link:

http://developer.apple.com/library/ios/#qa/qa1703/_index.html

    -(void) startRecording {
        // create the AVAssetWriter
        NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:OUTPUT_FILE_NAME];
        if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath]) {
            [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
        }

        NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
        NSError *movieError = nil;
        [assetWriter release];
        assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                                fileType:AVFileTypeQuickTimeMovie
                                                   error:&movieError];
        NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                  AVVideoCodecH264, AVVideoCodecKey,
                                                  [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                                  [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                                  nil];
        assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                              outputSettings:assetWriterInputSettings];
        assetWriterInput.expectsMediaDataInRealTime = YES;
        [assetWriter addInput:assetWriterInput];

        [assetWriterPixelBufferAdaptor release];
        assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                                         initWithAssetWriterInput:assetWriterInput
                                         sourcePixelBufferAttributes:nil];
        [assetWriter startWriting];

        firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        [assetWriter startSessionAtSourceTime:CMTimeMake(0, TIME_SCALE)];

        // start writing samples to it
        [assetWriterTimer release];
        assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                            target:self
                                                          selector:@selector(writeSample:)
                                                          userInfo:nil
                                                           repeats:YES];
    }

    -(void) stopRecording {
        [assetWriterTimer invalidate];
        assetWriterTimer = nil;

        [assetWriter finishWriting];
        NSLog(@"finished writing");
    }
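Note that the blocking -finishWriting call used above was deprecated in iOS 6 in favor of the asynchronous -finishWritingWithCompletionHandler:. A minimal sketch of the same stopRecording logic with the newer API:

    -(void) stopRecording {
        [assetWriterTimer invalidate];
        assetWriterTimer = nil;

        // finishWritingWithCompletionHandler: returns immediately; the writer
        // flushes any pending samples before invoking the block.
        [assetWriter finishWritingWithCompletionHandler:^{
            NSLog(@"finished writing");
        }];
    }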

    - (UIImage*)screenshot
    {
        // Create a graphics context with the target size.
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration;
        // on earlier iOS versions, fall back to UIGraphicsBeginImageContext.
        CGSize imageSize = [[UIScreen mainScreen] bounds].size;
        // Scale so the rendered bitmap matches the FRAME_WIDTH-pixel-wide
        // buffer the writer expects (pixel width = point width * scale)
        CGFloat imageScale = FRAME_WIDTH / imageSize.width;
        if (NULL != UIGraphicsBeginImageContextWithOptions)
            UIGraphicsBeginImageContextWithOptions(imageSize, NO, imageScale);
        else
            UIGraphicsBeginImageContext(imageSize);

        CGContextRef context = UIGraphicsGetCurrentContext();

        // Iterate over every window from back to front
        for (UIWindow *window in [[UIApplication sharedApplication] windows])
        {
            if (![window respondsToSelector:@selector(screen)] || [window screen] == [UIScreen mainScreen])
            {
                // -renderInContext: renders in the coordinate space of the layer,
                // so we must first apply the layer's geometry to the graphics context
                CGContextSaveGState(context);
                // Center the context around the window's anchor point
                CGContextTranslateCTM(context, [window center].x, [window center].y);
                // Apply the window's transform about the anchor point
                CGContextConcatCTM(context, [window transform]);
                // Offset by the portion of the bounds left of and above the anchor point
                CGContextTranslateCTM(context,
                                      -[window bounds].size.width * [[window layer] anchorPoint].x,
                                      -[window bounds].size.height * [[window layer] anchorPoint].y);

                // Render the layer hierarchy to the current context
                [[window layer] renderInContext:context];

                // Restore the context
                CGContextRestoreGState(context);
            }
        }

        // Retrieve the screenshot image
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        return image;
    }
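A likely cause of the black screen described in the question: -renderInContext: only captures content drawn by Core Animation itself, not content composited in hardware (video playback, OpenGL). On iOS 7 and later, -drawViewHierarchyInRect:afterScreenUpdates: goes through the snapshot machinery and captures more kinds of content. A sketch of an alternative capture method (the method name is mine; whether protected video content is captured still depends on how it is rendered):

    - (UIImage *)snapshotScreenshot
    {
        UIWindow *window = [[UIApplication sharedApplication] keyWindow];
        UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO, 0);
        // afterScreenUpdates:NO snapshots the most recently displayed frame
        [window drawViewHierarchyInRect:window.bounds afterScreenUpdates:NO];
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return image;
    }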

    // Release callback: frees the CFData backing a pixel buffer once the
    // writer is done with it (CVPixelBufferCreateWithBytes does not copy
    // the bytes, so they must stay alive until then).
    static void ReleaseCFDataCallback(void *releaseRefCon, const void *baseAddress)
    {
        CFRelease((CFDataRef)releaseRefCon);
    }

    -(void) writeSample:(NSTimer *)_timer {
        if (assetWriterInput.readyForMoreMediaData) {
            CVReturn cvErr = kCVReturnSuccess;

            // get screenshot image!
            CGImageRef image = (CGImageRef)[[self screenshot] CGImage];
            NSLog(@"made screenshot");

            // prepare the pixel buffer
            CVPixelBufferRef pixelBuffer = NULL;
            CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
            NSLog(@"copied image data");
            cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                 FRAME_WIDTH,
                                                 FRAME_HEIGHT,
                                                 kCVPixelFormatType_32BGRA,
                                                 (void *)CFDataGetBytePtr(imageData),
                                                 CGImageGetBytesPerRow(image),
                                                 ReleaseCFDataCallback,
                                                 (void *)imageData,
                                                 NULL,
                                                 &pixelBuffer);
            NSLog(@"CVPixelBufferCreateWithBytes returned %d", cvErr);

            // calculate the time
            CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
            CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
            NSLog(@"elapsedTime: %f", elapsedTime);
            CMTime presentationTime = CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE);

            // write the sample
            BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                                        withPresentationTime:presentationTime];
            // balance the create above; the writer keeps its own reference
            // and the callback frees imageData when the buffer is destroyed
            CVPixelBufferRelease(pixelBuffer);

            if (appended) {
                NSLog(@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
            } else {
                NSLog(@"failed to append");
                [self stopRecording];
                self.startStopButton.selected = NO;
            }
        }
    }
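Two details of writeSample are worth double-checking. First, the CGImage produced by UIGraphicsGetImageFromCurrentImageContext is normally RGBA, so declaring the buffer as kCVPixelFormatType_32BGRA can swap the red and blue channels. Second, copying the image bytes every frame is expensive. A common alternative (a sketch, assuming the adaptor is created with sourcePixelBufferAttributes requesting 32BGRA so that its pixelBufferPool is non-nil) is to render the window directly into a pooled pixel buffer:

    -(CVPixelBufferRef) newPixelBufferFromScreen {
        // Assumes FRAME_WIDTH x FRAME_HEIGHT matches the window's size in
        // pixels; otherwise scale the CTM before rendering.
        CVPixelBufferRef pixelBuffer = NULL;
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                           assetWriterPixelBufferAdaptor.pixelBufferPool,
                                           &pixelBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        // Premultiplied-first + little-endian is CoreGraphics' spelling of BGRA
        CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                     FRAME_WIDTH, FRAME_HEIGHT, 8,
                                                     CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Flip the context: CGBitmapContext's origin is bottom-left, UIKit's is top-left
        CGContextTranslateCTM(context, 0, FRAME_HEIGHT);
        CGContextScaleCTM(context, 1.0, -1.0);
        [[[[UIApplication sharedApplication] keyWindow] layer] renderInContext:context];

        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        return pixelBuffer;   // caller releases with CVPixelBufferRelease
    }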

Where:

    NSTimer *clockTimer;
    NSTimer *assetWriterTimer;
    AVMutableComposition *mutableComposition;
    AVAssetWriter *assetWriter;
    AVAssetWriterInput *assetWriterInput;
    AVAssetWriterInputPixelBufferAdaptor *assetWriterPixelBufferAdaptor;
    CFAbsoluteTime firstFrameWallClockTime;
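The snippets above also reference a few constants and a helper method that are not shown in the question. Their real definitions are unknown; illustrative stand-ins might look like this:

    // Illustrative values; the question does not show the real definitions.
    #define OUTPUT_FILE_NAME @"screenCapture.mov"
    #define FRAME_WIDTH      320
    #define FRAME_HEIGHT     480
    #define TIME_SCALE       600   // CMTime ticks per second

    - (NSString *)pathToDocumentsDirectory {
        return [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                    NSUserDomainMask, YES) lastObject];
    }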

Best answer

I got a sample application from the iOS Developer Library: AVSimpleEditor. Besides video recording, it can also trim, rotate, crop, and add music.

Description:

AVSimpleEditor is a simple AVFoundation-based movie editing application that uses the AVVideoComposition and AVAudioMix APIs and demonstrates how they can be used for simple video editing tasks. It also demonstrates how they interact with playback (AVPlayerItem) and export (AVAssetExportSession). The application performs trimming, rotating, cropping, adding music, adding watermarks, and exporting.
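The export step mentioned in the description is handled by AVAssetExportSession. A minimal sketch of exporting an asset to a QuickTime movie (sourceURL and outputURL are placeholders for your own file URLs):

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"export finished");
        } else {
            NSLog(@"export failed: %@", exportSession.error);
        }
    }];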

Regarding this question, we found a similar one on Stack Overflow: https://stackoverflow.com/questions/18285557/
