arkit - How to apply a CIFilter to the MTLTexture from ARMatteGenerator?

Tags: arkit, metal, cifilter

I am working from Apple's sample project concerning the use of ARMatteGenerator to generate an MTLTexture that can be used as an occlusion matte in the people-occlusion technology.

I would like to determine how to run the generated matte through a CIFilter. In my code, I "filter" the matte like so:

func updateMatteTextures(commandBuffer: MTLCommandBuffer) {
    guard let currentFrame = session.currentFrame else {
        return
    }
    var targetImage: CIImage?
    alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer)
    dilatedDepthTexture = matteGenerator.generateDilatedDepth(from: currentFrame, commandBuffer: commandBuffer)
    targetImage = CIImage(mtlTexture: alphaTexture!, options: nil)
    monoAlphaCIFilter?.setValue(targetImage!, forKey: kCIInputImageKey)
    monoAlphaCIFilter?.setValue(CIColor.red, forKey: kCIInputColorKey)
    targetImage = (monoAlphaCIFilter?.outputImage)!
    let drawingBounds = CGRect(origin: .zero, size: CGSize(width: alphaTexture!.width, height: alphaTexture!.height))
    context.render(targetImage!, to: alphaTexture!, commandBuffer: commandBuffer, bounds: drawingBounds, colorSpace: CGColorSpaceCreateDeviceRGB())
}
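One thing worth checking in the snippet above (my own assumption; the question doesn't show how context was created): CIContext.render(_:to:commandBuffer:bounds:colorSpace:) can only encode into the shared command buffer if the context was built on the same Metal device the renderer uses, and Core Image can only write into the target texture if its usage includes .shaderWrite. A minimal sketch of that setup:

```swift
import CoreImage
import Metal

// Sketch: build the CIContext from the renderer's MTLDevice so that
// context.render(_:to:commandBuffer:bounds:colorSpace:) can encode its
// work into the same command buffer as the matte generation.
let device = MTLCreateSystemDefaultDevice()!
let context = CIContext(mtlDevice: device)

// Note (assumption): the matte returned by ARMatteGenerator may not have
// been created with .shaderWrite usage, in which case rendering back into
// alphaTexture would silently fail; rendering into your own intermediate
// texture created with [.shaderRead, .shaderWrite] sidesteps that.
```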

When I composite the matte texture and the backgrounds, there is no filtering effect applied to the matte. This is how the textures are being composited:

func compositeImagesWithEncoder(renderEncoder: MTLRenderCommandEncoder) {
    guard let textureY = capturedImageTextureY, let textureCbCr = capturedImageTextureCbCr else {
        return
    }

    // Push a debug group allowing us to identify render commands in the GPU Frame Capture tool
    renderEncoder.pushDebugGroup("CompositePass")

    // Set render command encoder state
    renderEncoder.setCullMode(.none)
    renderEncoder.setRenderPipelineState(compositePipelineState)
    renderEncoder.setDepthStencilState(compositeDepthState)

    // Setup plane vertex buffers
    renderEncoder.setVertexBuffer(imagePlaneVertexBuffer, offset: 0, index: 0)
    renderEncoder.setVertexBuffer(scenePlaneVertexBuffer, offset: 0, index: 1)

    // Setup textures for the composite fragment shader
    renderEncoder.setFragmentBuffer(sharedUniformBuffer, offset: sharedUniformBufferOffset, index: Int(kBufferIndexSharedUniforms.rawValue))
    renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureY), index: 0)
    renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureCbCr), index: 1)
    renderEncoder.setFragmentTexture(sceneColorTexture, index: 2)
    renderEncoder.setFragmentTexture(sceneDepthTexture, index: 3)
    renderEncoder.setFragmentTexture(alphaTexture, index: 4)
    renderEncoder.setFragmentTexture(dilatedDepthTexture, index: 5)

    // Draw final quad to display
    renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
    renderEncoder.popDebugGroup()
}

How can I apply the CIFilter to only the alphaTexture generated by the ARMatteGenerator?

Best Answer

I don't think you want to apply a CIFilter to the alphaTexture. I assume you are using Apple's Effecting People Occlusion in Custom Renderers sample code. If you watch this year's Bringing People into AR WWDC session, they talk about generating a segmentation matte using ARMatteGenerator, which is what is being done with alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer). alphaTexture is a MTLTexture that is essentially an alpha mask over where humans have been detected in the camera frame (i.e. fully opaque where there is a human and fully transparent where there is not).

Apple documentation

Adding a filter to the alpha texture won't filter the final rendered image; it will simply affect the mask used in the compositing. If you are trying to achieve the video linked in your previous question, I would recommend adjusting the Metal shader where the compositing occurs. In the session, they point out that they compare the dilatedDepth and the renderedDepth to see whether they should draw virtual content or pixels from the camera:

fragment half4 customComposition(...) {
    half4 camera = cameraTexture.sample(s, in.uv);
    half4 rendered = renderedTexture.sample(s, in.uv);
    float renderedDepth = renderedDepthTexture.sample(s, in.uv);
    half4 scene = mix(rendered, camera, rendered.a);
    half matte = matteTexture.sample(s, in.uv);
    float dilatedDepth = dilatedDepthTexture.sample(s, in.uv);

    if (dilatedDepth < renderedDepth) { // People in front of rendered
        // mix together the virtual content and camera feed based on the alpha provided by the matte
        return mix(scene, camera, matte);
    } else {
        // People are not in front so just return the scene
        return scene;
    }
}

Unfortunately, this is done slightly differently in the sample code, but it is still fairly easy to modify. Open Shaders.metal and find the compositeImageFragmentShader function. Toward the end of the function you will see half4 occluderResult = mix(sceneColor, cameraColor, alpha); This is essentially the same operation as the mix(scene, camera, matte); we saw above: we are deciding, based on the segmentation matte, whether to use a pixel from the scene or a pixel from the camera feed. We can easily replace the camera image pixel with an arbitrary RGBA value by replacing cameraColor with a half4 representing a color. For example, we could use half4(float4(0.0, 0.0, 1.0, 1.0)) to paint all of the pixels within the segmentation matte blue:
…
// Replacing camera color with blue
half4 occluderResult = mix(sceneColor, half4(float4(0.0, 0.0, 1.0, 1.0)), alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;

Screencast

Of course, you can apply other effects as well. Dynamic grayscale static is fairly easy to implement.

Above compositeImageFragmentShader, add:
float random(float offset, float2 tex_coord, float time) {
    // pick two numbers that are unlikely to repeat
    float2 non_repeating = float2(12.9898 * time, 78.233 * time);

    // multiply our texture coordinates by the non-repeating numbers, then add them together
    float sum = dot(tex_coord, non_repeating);

    // calculate the sine of our sum to get a range between -1 and 1
    float sine = sin(sum);

    // multiply the sine by a big, non-repeating number so that even a small change will result in a big color jump
    float huge_number = sine * 43758.5453 * offset;

    // get just the numbers after the decimal point
    float fraction = fract(huge_number);

    // send the result back to the caller
    return fraction;
}
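As a sanity check (my own sketch, not part of the sample), the same hash can be reproduced off-GPU. fract(sin(...)) always lands in [0, 1), which is why the result can be used directly as a grayscale value in the fragment shader:

```swift
import Foundation

// Swift port of the shader's random():
// fract(sin(dot(tex_coord, float2(12.9898, 78.233) * time)) * 43758.5453 * offset)
func shaderRandom(offset: Float, texCoord: (x: Float, y: Float), time: Float) -> Float {
    let sum = texCoord.x * (12.9898 * time) + texCoord.y * (78.233 * time)
    let hugeNumber = sin(sum) * 43758.5453 * offset
    return hugeNumber - floor(hugeNumber) // fract(): digits after the decimal point
}

// The result always falls in [0, 1), so it can be used as a gray level directly.
let v = shaderRandom(offset: 1.0, texCoord: (0.25, 0.75), time: 0.5)
assert(v >= 0 && v < 1)
```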

(Taken from @twostraws' ShaderKit.)

Then modify compositeImageFragmentShader to:
…
float randFloat = random(1.0, cameraTexCoord, rgb[0]);

half4 occluderResult = mix(sceneColor, half4(float4(randFloat, randFloat, randFloat, 1.0)), alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;

You should get:

Static screencast

Finally, the debugger seems to have a hard time keeping up with the app. For me, the app would freeze shortly after launch when running attached to Xcode, but it was typically smooth when run on its own.

Regarding "arkit - How to apply a CIFilter to the MTLTexture from ARMatteGenerator?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/57228411/
