ios - How to create a video from ARSKView (or SKView) frames, including mic audio (iOS 11)

Tags: ios swift sprite-kit arkit

I'm playing with ARKit and I'd like to create a video from the ARSKView frames. I tried using ReplayKit, but its behavior isn't what I expect:

- I don't want to record the entire screen.
- I don't want the user to be prompted that the screen is being recorded.

Also, how can I combine the mic input with the video? I assume the audio isn't streamed through the ARSKView. Here is the code (from the Apple sample):

import UIKit
import SpriteKit
import ARKit

class ViewController: UIViewController, ARSKViewDelegate {

    @IBOutlet var sceneView: ARSKView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Set the view's delegate
        sceneView.delegate = self

        // Show statistics such as fps and node count
        sceneView.showsFPS = true
        sceneView.showsNodeCount = true

        // Load the SKScene from 'Scene.sks'
        if let scene = SKScene(fileNamed: "Scene") {
            sceneView.presentScene(scene)
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Create a session configuration
        // (ARWorldTrackingSessionConfiguration was renamed to
        // ARWorldTrackingConfiguration in the final iOS 11 SDK)
        let configuration = ARWorldTrackingConfiguration()

        // Run the view's session
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the view's session
        sceneView.session.pause()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Release any cached data, images, etc that aren't in use.
    }

    // MARK: - ARSKViewDelegate

    func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
        // Create and configure a node for the anchor added to the view's session.
        let labelNode = SKLabelNode(text: "👾")
        labelNode.horizontalAlignmentMode = .center
        labelNode.verticalAlignmentMode = .center
        return labelNode
    }

    func session(_ session: ARSession, didFailWithError error: Error) {
        // Present an error message to the user

    }

    func sessionWasInterrupted(_ session: ARSession) {
        // Inform the user that the session has been interrupted, for example, by presenting an overlay

    }

    func sessionInterruptionEnded(_ session: ARSession) {
        // Reset tracking and/or remove existing anchors if consistent tracking is required

    }
}

And the Scene class, in case it's needed:

import SpriteKit
import ARKit
class Scene: SKScene {

    override func didMove(to view: SKView) {
        // Setup your scene here
    }

    override func update(_ currentTime: TimeInterval) {
        // Called before each frame is rendered
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let sceneView = self.view as? ARSKView else {
            return
        }

        // Create anchor using the camera's current position
        if let currentFrame = sceneView.session.currentFrame {

            // Create a transform with a translation of 0.2 meters in front of the camera
            var translation = matrix_identity_float4x4
            translation.columns.3.z = -0.2
            let transform = simd_mul(currentFrame.camera.transform, translation)

            // Add a new anchor to the session
            let anchor = ARAnchor(transform: transform)
            sceneView.session.add(anchor: anchor)
        }
    }
}

Best answer

If you only need to record the camera frames (as with an AVCaptureSession, not the "real" 3D scene with its SCNNodes), just grab them as ARFrame.capturedImage in the updateAtTime delegate function of SCNSceneRenderer:

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    createMovieWriterOnce(frame: session.currentFrame)
    appendFrameWithMetadaToMovie(frame: session.currentFrame)
}   
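Since the question records from an ARSKView (SpriteKit), there is no SCNSceneRenderer delegate to hook into; a sketch of the equivalent per-frame hook, assuming the same helper functions from this answer, is ARSessionDelegate's session(_:didUpdate:):

```swift
extension ViewController: ARSessionDelegate {
    // Requires sceneView.session.delegate = self (e.g. in viewDidLoad).
    // Called once for every camera frame the session captures.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // frame.capturedImage is the raw camera CVPixelBuffer;
        // the SpriteKit overlay is not composited into it.
        createMovieWriterOnce(frame: frame)
        appendFrameWithMetadaToMovie(frame: frame)
    }
}
```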

I haven't found a way to get the frame size from the ARSession, so the MovieWriter waits for the first frame before setting its dimensions:

func createMovieWriterOnce(frame: ARFrame?) {
    if(frame == nil) { return }
    DispatchQueue.once(token: "SimplestMovieWriter.constructor") {
        movieWriter = SimplestMovieWriter(frameWidth: CVPixelBufferGetWidth(frame!.capturedImage), frameHeight: CVPixelBufferGetHeight(frame!.capturedImage))
    }
} 
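Note that DispatchQueue.once is not part of libdispatch; the answer relies on a community extension. A minimal sketch of one, assuming only process-lifetime deduplication is needed:

```swift
extension DispatchQueue {
    private static var onceTokens = Set<String>()

    /// Runs `block` at most once per `token` for the lifetime of the process.
    class func once(token: String, block: () -> Void) {
        // Serialize token checks so concurrent callers cannot both run the block
        objc_sync_enter(self); defer { objc_sync_exit(self) }
        guard !onceTokens.contains(token) else { return }
        onceTokens.insert(token)
        block()
    }
}
```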

Every subsequent CVPixelBuffer is then fed to the MovieWriter:

func appendFrameWithMetadaToMovie(frame: ARFrame?) {
    if(!isVideoRecording || frame == nil) { return }
    let interestingPoints = frame?.rawFeaturePoints?.points
    movieWriter.appendBuffer(buffer: (frame?.capturedImage)!, withMetadata: interestingPoints)
}

MovieWriter is a custom class built around AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor.
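The answer doesn't show SimplestMovieWriter itself. A minimal sketch of such a class follows (hypothetical, not the answerer's code; it adds an outputURL parameter, assumes a fixed 30 fps timeline, and keeps the Swift 3-era string constants the answer already uses):

```swift
final class SimplestMovieWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var frameCount: Int64 = 0

    init?(outputURL: URL, frameWidth: Int, frameHeight: Int) {
        guard let writer = try? AVAssetWriter(outputURL: outputURL,
                                              fileType: AVFileTypeQuickTimeMovie) else { return nil }
        self.writer = writer
        input = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [
            AVVideoCodecKey: AVVideoCodecH264,
            AVVideoWidthKey: frameWidth,
            AVVideoHeightKey: frameHeight
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: kCMTimeZero)
    }

    func appendBuffer(buffer: CVPixelBuffer, withMetadata metadata: Any? = nil) {
        // Drop the frame if the encoder is still busy with the previous one
        guard input.isReadyForMoreMediaData else { return }
        adaptor.append(buffer, withPresentationTime: CMTimeMake(frameCount, 30))
        frameCount += 1
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

In practice you would derive each presentation time from ARFrame.timestamp (via CMTimeMakeWithSeconds) rather than assuming a constant frame rate, and do something useful with the metadata parameter, which is ignored here.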

You can save the video without audio first and then add whatever you want (audio, subtitles, metadata) with AVAssetExportSession:

let composition = AVMutableComposition()
...
let trackVideo = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
let videoFileAsset = AVURLAsset(url: currentURL!, options: nil)
let videoFileAssetTrack = videoFileAsset.tracks(withMediaType: AVMediaTypeVideo)[0]

// add audio track here    
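The elided audio and export steps can be sketched as follows, assuming the mic was recorded separately (e.g. with AVAudioRecorder); audioURL and exportURL are hypothetical names, and currentURL comes from the answer above:

```swift
let videoRange = CMTimeRangeMake(kCMTimeZero, videoFileAsset.duration)
try? trackVideo?.insertTimeRange(videoRange, of: videoFileAssetTrack, at: kCMTimeZero)

// Audio track from a separate mic recording (audioURL is hypothetical)
let trackAudio = composition.addMutableTrack(withMediaType: AVMediaTypeAudio,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)
let audioAsset = AVURLAsset(url: audioURL, options: nil)
if let audioTrack = audioAsset.tracks(withMediaType: AVMediaTypeAudio).first {
    try? trackAudio?.insertTimeRange(videoRange, of: audioTrack, at: kCMTimeZero)
}

// Export the combined composition (exportURL is hypothetical)
let exporter = AVAssetExportSession(asset: composition,
                                    presetName: AVAssetExportPresetHighestQuality)
exporter?.outputURL = exportURL
exporter?.outputFileType = AVFileTypeQuickTimeMovie
exporter?.exportAsynchronously {
    // inspect exporter?.status / exporter?.error here
}
```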

This question about ios - How to create a video from ARSKView (or SKView) frames, including mic audio (iOS 11) corresponds to a similar question on Stack Overflow: https://stackoverflow.com/questions/44704360/
