ios - Getting the raw frames of a recorded video

Tags: ios image-processing video-processing ca

I am new to Objective-C and iOS development. I want to record video in code, and at run time I need to get each frame as raw data for some processing. How can I achieve this? Please help. Thanks in advance. Here is my code so far:

- (void)viewDidLoad
{
    [super viewDidLoad];

    [self setupCaptureSession];

}

The viewDidAppear method:

-(void)viewDidAppear:(BOOL)animated
{
    if (!_bpickeropen)
    {
       _bpickeropen = true;
        _picker = [[UIImagePickerController alloc] init];
        _picker.delegate = self;
        NSArray *sourceTypes = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
        if (![sourceTypes containsObject:(NSString *)kUTTypeMovie ])
        {
            NSLog(@"device not supported");
            return;
        }

        _picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        _picker.mediaTypes = [NSArray arrayWithObjects:(NSString *)kUTTypeMovie,nil];//,(NSString *) kUTTypeImage
        _picker.videoQuality = UIImagePickerControllerQualityTypeHigh;
        [self presentModalViewController:_picker animated:YES];
    }



}

// Delegate routine that is called when a sample buffer is written

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{

    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);


    CVPixelBufferLockBaseAddress(cameraFrame, 0);

    GLubyte *rawImageBytes = (GLubyte *)CVPixelBufferGetBaseAddress(cameraFrame);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);

    NSData *dataForRawBytes = [NSData dataWithBytes:rawImageBytes length:bytesPerRow * CVPixelBufferGetHeight(cameraFrame)];

Questions: 1. Here I am getting the raw bytes only once. 2. After that, I want to store those raw bytes as a binary file in the application path.

    // Do whatever with your bytes
    NSLog(@"bytes per row %zd", bytesPerRow);

    [dataForRawBytes writeToFile:[self datafilepath] atomically:YES];

    NSLog(@"Sample Buffer Data is %@\n", dataForRawBytes);

    CVPixelBufferUnlockBaseAddress(cameraFrame, 0);

}
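On the second question above: writeToFile:atomically: replaces the file on every call, so only the last frame would survive. One option is to create the file once during setup and append each frame's data from the delegate callback with an NSFileHandle. A minimal sketch, assuming a hypothetical frameFileHandle property and the datafilepath helper from the question:

```objectivec
// During setup (e.g. in setupCaptureSession): create the file once
// and keep a handle open for appending.
[[NSFileManager defaultManager] createFileAtPath:[self datafilepath]
                                        contents:nil
                                      attributes:nil];
self.frameFileHandle =
    [NSFileHandle fileHandleForWritingAtPath:[self datafilepath]];

// Inside captureOutput:didOutputSampleBuffer:fromConnection:,
// append the current frame instead of replacing the file:
[self.frameFileHandle seekToEndOfFile];
[self.frameFileHandle writeData:dataForRawBytes];
```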

Here I set the delegate of the output:

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

// Create the session
AVCaptureSession *session = [[AVCaptureSession alloc] init];


// Configure the session to produce lower resolution video frames, if your
// processing algorithm can cope. We'll specify medium quality for the
// chosen device.
session.sessionPreset = AVCaptureSessionPresetMedium;

// Find a suitable AVCaptureDevice
AVCaptureDevice *device = [AVCaptureDevice
                           defaultDeviceWithMediaType:AVMediaTypeVideo];

// Create a device input with the device and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                    error:&error];
if (!input)
{
    // Handle the error appropriately.
    NSLog(@"Could not create device input: %@", error);
    return;
}
[session addInput:input];

// Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];


// Configure your output.
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

// Specify the pixel format
output.videoSettings =
[NSDictionary dictionaryWithObject:
 [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                            forKey:(id)kCVPixelBufferPixelFormatTypeKey]; //kCVPixelBufferPixelFormatTypeKey


// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.
// output.minFrameDuration = CMTimeMake(1, 15);

// Start the session running to start the flow of data
[session startRunning];

// Assign session to an ivar.
//[self setSession:session];

}
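Note that UIImagePickerController records on its own and never delivers per-frame data; it is the AVCaptureSession pipeline above that feeds captureOutput:didOutputSampleBuffer:fromConnection:. To still see what the camera captures while the session runs, a preview layer can be attached. A minimal sketch (added to setupCaptureSession before startRunning; the layer setup here is illustrative, not part of the original code):

```objectivec
// Render the session's live video without interfering with the
// video data output that feeds the delegate.
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = self.view.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:previewLayer];
```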

Any help is greatly appreciated. Thanks in advance.

Best Answer

You can look into the AVFoundation framework. It gives you access to the raw data the camera produces.

This link is a good entry-level project on using the camera with AVFoundation.

To get individual frames from the video output, you can use the AVCaptureVideoDataOutput class from the AVFoundation framework.

Hope this helps.

Edit: You can look at the delegate functions of AVCaptureVideoDataOutputSampleBufferDelegate, in particular the captureOutput:didOutputSampleBuffer:fromConnection: method. This function is called every time a new frame is captured.

If you are not sure how delegates work, this link is a good example of delegates.

Regarding ios - getting the raw frames of a recorded video, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/22601884/
