I'm using this Swift class (originally shown in the answer to this question: Capture Metal MTKView as Movie in realtime?) to try to record my Metal app's frames to a movie file.
import AVFoundation
import Metal
import QuartzCore

class MetalVideoRecorder {
    var isRecording = false
    var recordingStartTime = TimeInterval(0)

    private var assetWriter: AVAssetWriter
    private var assetWriterVideoInput: AVAssetWriterInput
    private var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor

    init?(outputURL url: URL, size: CGSize) {
        do {
            assetWriter = try AVAssetWriter(outputURL: url, fileType: .m4v)
        } catch {
            return nil
        }

        let outputSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: size.width,
            AVVideoHeightKey: size.height
        ]

        assetWriterVideoInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
        assetWriterVideoInput.expectsMediaDataInRealTime = true

        let sourcePixelBufferAttributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String: size.width,
            kCVPixelBufferHeightKey as String: size.height
        ]

        assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: assetWriterVideoInput,
            sourcePixelBufferAttributes: sourcePixelBufferAttributes)

        assetWriter.add(assetWriterVideoInput)
    }

    func startRecording() {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: .zero)
        recordingStartTime = CACurrentMediaTime()
        isRecording = true
    }

    func endRecording(_ completionHandler: @escaping () -> ()) {
        isRecording = false
        assetWriterVideoInput.markAsFinished()
        assetWriter.finishWriting(completionHandler: completionHandler)
    }

    func writeFrame(forTexture texture: MTLTexture) {
        if !isRecording {
            return
        }

        while !assetWriterVideoInput.isReadyForMoreMediaData {}

        guard let pixelBufferPool = assetWriterPixelBufferInput.pixelBufferPool else {
            print("Pixel buffer asset writer input did not have a pixel buffer pool available; cannot retrieve frame")
            return
        }

        var maybePixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
        if status != kCVReturnSuccess {
            print("Could not get pixel buffer from asset writer input; dropping frame...")
            return
        }

        guard let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!

        // Use the bytes per row value from the pixel buffer since its stride may be rounded up to be 16-byte aligned
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

        let frameTime = CACurrentMediaTime() - recordingStartTime
        let presentationTime = CMTime(seconds: frameTime, preferredTimescale: 240)
        assetWriterPixelBufferInput.append(pixelBuffer, withPresentationTime: presentationTime)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    }
}
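For context, a minimal sketch of how such a recorder might be driven. The output URL, frame size, and the surrounding render loop are assumptions for illustration, not part of the original post:

```swift
import AVFoundation
import Metal

// Hypothetical setup: the URL and size here are placeholders.
let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("capture.m4v")
let recorder = MetalVideoRecorder(outputURL: url, size: CGSize(width: 1920, height: 1080))

recorder?.startRecording()
// ... render frames, calling recorder?.writeFrame(forTexture:) once per frame ...
recorder?.endRecording {
    // Called once the asset writer has finished writing the movie file.
    print("Finished writing \(url.lastPathComponent)")
}
```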
I don't see any errors, but the frames in the resulting QuickTime file are all black. The frame size is correct, and my pixel format is correct (bgra8Unorm). Does anyone know why it might not be working?
I call the writeFrame function before presenting and committing the current drawable, like this:
if let drawable = view.currentDrawable {
    if BigVideoWriter != nil && BigVideoWriter!.isRecording {
        commandBuffer.addCompletedHandler { commandBuffer in
            BigVideoWriter?.writeFrame(forTexture: drawable.texture)
        }
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
Initially I did get an error that my MetalKitView's layer was "framebufferOnly". So I set it to false before trying to record. That got rid of the error, but the frames were all black. I also tried setting it to false at the very start of the program, but I got the same result.
I also tried using "addCompletedHandler" rather than "addScheduledHandler", but that gave me the error "[CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead.".
Thanks for any suggestions!
EDIT: I solved this with help from @Idogy. Testing showed that the original version worked on iOS but not on Mac. He said that because I have an NVIDIA GPU, the framebuffer is private. So I had to add a blit command encoder with a synchronize call on the texture, and then it started working. Like this:
if let drawable = view.currentDrawable {
    if BigVideoWriter != nil && BigVideoWriter!.isRecording {
        #if os(macOS)
        if let blitCommandEncoder = commandBuffer.makeBlitCommandEncoder() {
            blitCommandEncoder.synchronize(resource: drawable.texture)
            blitCommandEncoder.endEncoding()
        }
        #endif
        commandBuffer.addCompletedHandler { commandBuffer in
            BigVideoWriter?.writeFrame(forTexture: drawable.texture)
        }
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
Best answer
I think you're writing your frames too soon: by calling writeFrame from within your render loop, you're capturing the drawable while it's still empty (the GPU just hasn't rendered it yet).

Remember that before you call commandBuffer.commit(), the GPU hasn't even started rendering your frame. You need to wait for the GPU to finish rendering before trying to grab the resulting frame. The ordering is a bit confusing because you also call present() before calling commit(), but that isn't the actual order of operations at runtime. The present call merely tells Metal to schedule a call to present your frame to the screen once the GPU has finished rendering.
You should call writeFrame from a completion handler (using commandBuffer.addCompletedHandler()). That should take care of this problem.
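A minimal sketch of that pattern; `view`, `commandBuffer`, and `recorder` are assumed to exist in the caller's render loop and are not from the original answer:

```swift
// Capture only after the GPU has finished rendering this frame.
if let drawable = view.currentDrawable {
    // Grab the texture reference before presenting; querying drawable.texture
    // after presentation raises the "Get a nextDrawable instead" error.
    let texture = drawable.texture
    commandBuffer.addCompletedHandler { _ in
        // Runs once the GPU has executed all work in this command buffer.
        recorder.writeFrame(forTexture: texture)
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```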
UPDATE: While the answer above is correct, it is only partial. Since the OP is using a discrete GPU with private VRAM, the CPU cannot see the render target's pixels. The solution to that problem is to add an MTLBlitCommandEncoder and use its synchronize(resource:) method to ensure the rendered pixels are copied back from the GPU's VRAM to RAM.
Regarding "swift - All-black frames when trying to write Metal frames to a QuickTime file with AVFoundation AVAssetWriter", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/50633899/