objective-c - Encoding an H.264 compression session using CGDisplayStream

Tags: objective-c core-graphics h.264 video-toolbox iosurface

I'm trying to create an H.264 compression session using the data on my screen. I've created a CGDisplayStreamRef instance as follows:

displayStream = CGDisplayStreamCreateWithDispatchQueue(0, 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    //Call encoding session here
});

Here is how my encoding function is currently set up:

- (void) encode:(CMSampleBufferRef )sampleBuffer {
    CVImageBufferRef imageBuffer = (CVImageBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime presentationTimeStamp = CMTimeMake(frameID++, 1000);
    VTEncodeInfoFlags flags;
    OSStatus statusCode = VTCompressionSessionEncodeFrame(EncodingSession,
                                                          imageBuffer,
                                                          presentationTimeStamp,
                                                          kCMTimeInvalid,
                                                          NULL, NULL, &flags);
    if (statusCode != noErr) {
        NSLog(@"H264: VTCompressionSessionEncodeFrame failed with %d", (int)statusCode);

        VTCompressionSessionInvalidate(EncodingSession);
        CFRelease(EncodingSession);
        EncodingSession = NULL;
        return;
    }
    NSLog(@"H264: VTCompressionSessionEncodeFrame Success");
}
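For context, the encode: method above assumes an EncodingSession already exists. A minimal, hedged sketch of creating one with VideoToolbox follows; the method name, dimensions, and the C output callback didCompressH264 are assumptions, not part of the original post:

```objectivec
// Sketch only: creates the VTCompressionSession that encode: relies on.
// The callback name and property choices are assumptions.
- (void)setupEncoderWithWidth:(int32_t)width height:(int32_t)height {
    OSStatus status = VTCompressionSessionCreate(NULL,
                                                 width, height,
                                                 kCMVideoCodecType_H264,
                                                 NULL, NULL, NULL,
                                                 didCompressH264,   // hypothetical C output callback
                                                 (__bridge void *)self,
                                                 &EncodingSession);
    if (status != noErr) {
        NSLog(@"H264: VTCompressionSessionCreate failed with %d", (int)status);
        return;
    }
    // Real-time encoding suits a live screen-capture source.
    VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
    VTCompressionSessionPrepareToEncodeFrames(EncodingSession);
}
```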

I'm trying to understand how I can convert the data from my screen into a CMSampleBufferRef so that I can call my encode function correctly. So far I haven't been able to determine whether this is even possible, or what the right approach is for what I'm trying to do. Does anyone have any suggestions?

Edit: I've converted my IOSurface to a CMBlockBuffer, but haven't yet figured out how to convert that into a CMSampleBufferRef:

void *mem = IOSurfaceGetBaseAddress(frameSurface);
size_t bytesPerRow = IOSurfaceGetBytesPerRow(frameSurface);
size_t height = IOSurfaceGetHeight(frameSurface);
size_t totalBytes = bytesPerRow * height;

CMBlockBufferRef blockBuffer;

CMBlockBufferCreateWithMemoryBlock(kCFAllocatorNull, mem, totalBytes, kCFAllocatorNull, NULL, 0, totalBytes, 0, &blockBuffer);

Edit 2

Some more progress:

CMSampleBufferRef sampleBuffer;

OSStatus sampleStatus = CMSampleBufferCreate(
                             NULL, blockBuffer, TRUE, NULL, NULL,
                             NULL, 1, 1, NULL,
                             0, NULL, &sampleBuffer);

[self encode:sampleBuffer];

Best Answer

I may be a little late, but this might help others:

CGDisplayStreamCreateWithDispatchQueue(CGMainDisplayID(), 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    // Only complete frames carry a valid surface.
    if (status != kCGDisplayStreamFrameStatusFrameComplete) {
        return;
    }

    // The created pixel buffer retains the surface object.
    CVPixelBufferRef pixelBuffer;
    CVPixelBufferCreateWithIOSurface(NULL, frameSurface, NULL, &pixelBuffer);

    // Create the video-type-specific description for the pixel buffer.
    CMVideoFormatDescriptionRef videoFormatDescription;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoFormatDescription);

    // All the necessary parts for creating a `CMSampleBuffer` are ready.
    // Fill in real timestamps here; an uninitialized CMSampleTimingInfo
    // contains garbage values.
    CMSampleBufferRef sampleBuffer;
    CMSampleTimingInfo timingInfo = { kCMTimeInvalid, kCMTimeInvalid, kCMTimeInvalid };
    CMSampleBufferCreateReadyWithImageBuffer(NULL, pixelBuffer, videoFormatDescription, &timingInfo, &sampleBuffer);

    // Do the stuff

    // Release the resources to let the frame surface be reused in the queue.
    // `kCGDisplayStreamQueueDepth` is responsible for the size of the queue.
    CFRelease(sampleBuffer);
    CFRelease(videoFormatDescription);
    CFRelease(pixelBuffer);
});
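One detail worth noting: the displayTime the handler receives is in Mach absolute time units, not seconds, so it can't be used directly as a CMTime. A hedged sketch of converting it into a nanosecond-timescale presentation timestamp (the helper name is an assumption):

```objectivec
#include <mach/mach_time.h>

// Hypothetical helper: convert the callback's displayTime (Mach absolute
// time) into a CMTime suitable for CMSampleTimingInfo.presentationTimeStamp.
static CMTime CMTimeFromMachTime(uint64_t machTime) {
    mach_timebase_info_data_t timebase;
    mach_timebase_info(&timebase);
    uint64_t nanos = machTime * timebase.numer / timebase.denom;
    return CMTimeMake((int64_t)nanos, 1000000000); // nanosecond timescale
}
```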

This Q&A on objective-c - encoding an H.264 compression session using CGDisplayStream is based on a similar question found on Stack Overflow: https://stackoverflow.com/questions/42721360/
