Looking for some help porting this Objective-C class method to JS/NativeScript. Every variant I have tried results in TypeError: undefined is not a function...
https://developer.apple.com/documentation/avfoundation/avvideocomposition/1389556-init
I tried writing it in JS as follows:
const videoComp = AVVideoComposition.alloc().initWithAssetApplyingCIFiltersWithHandler(asset, (request) => { ... });
//OR
const videoComp = AVVideoComposition.alloc().initAssetApplyingCIFiltersWithHandler(asset, (request) => { ... });
//OR
const videoComp = AVVideoComposition.alloc().initAssetApplyingCIFiltersWithHandlerApplier(asset, (request) => { ... });
//OR
const videoComp = new AVVideoComposition(asset, (request) => { ... });
to name just a few. Essentially, I am trying to port this code to NativeScript/JS:
let blurRadius = 6.0
let asset = AVAsset(url: streamURL)
let item = AVPlayerItem(asset: asset)
item.videoComposition = AVVideoComposition(asset: asset) { request in
    let blurred = request.sourceImage.clampedToExtent().applyingGaussianBlur(sigma: blurRadius)
    let output = blurred.clamped(to: request.sourceImage.extent)
    request.finish(with: output, context: nil)
}
The snippet comes from this blog post: https://willowtreeapps.com/ideas/how-to-apply-a-filter-to-a-video-stream-in-ios
Best Answer
With JavaScript/TypeScript it should look something like this:
let blurRadius = 6.0;
let asset = AVAsset.assetWithURL(streamURL);
let item = AVPlayerItem.alloc().initWithAsset(asset);
item.videoComposition = AVVideoComposition.videoCompositionWithAssetApplyingCIFiltersWithHandler(asset, request => {
    let blurred = request.sourceImage.imageByClampingToExtent().imageByApplyingGaussianBlurWithSigma(blurRadius);
    let output = blurred.imageByClampingToRect(request.sourceImage.extent);
    request.finishWithImageContext(output, null);
});
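The spelling of that class method is the crux. In Objective-C this API is exposed as the class factory method videoCompositionWithAsset:applyingCIFiltersWithHandler: (the Swift init(asset:applyingCIFiltersWithHandler:) maps onto it), and NativeScript derives the JavaScript name by camel-casing the selector segments, so the alloc().init... and constructor forms from the question simply do not exist on the bridged class. A rough, comments-only illustration of that mapping for the calls used above:

// Objective-C selector                                     ->  NativeScript/JavaScript call
// videoCompositionWithAsset:applyingCIFiltersWithHandler:  ->  AVVideoComposition.videoCompositionWithAssetApplyingCIFiltersWithHandler(asset, handler)
// imageByApplyingGaussianBlurWithSigma:                    ->  image.imageByApplyingGaussianBlurWithSigma(sigma)
// finishWithImage:context:                                 ->  request.finishWithImageContext(image, context)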
Note: the JavaScript above is untested; it is simply a translation of the given native code. Use tns-platform-declarations for IntelliSense support.
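For the IntelliSense part, tns-platform-declarations is normally wired up through a references.d.ts file in the project root; the path below assumes the package's standard layout:

/// <reference path="./node_modules/tns-platform-declarations/ios.d.ts" />

In the snippet above, streamURL would be an NSURL (for example NSURL.URLWithString("https://example.com/stream.m3u8"), with a placeholder URL). As a further untested sketch, the player item can then be attached to an AVPlayer in the usual way; the wiring below is an illustrative assumption, not part of the original answer:

const player = AVPlayer.playerWithPlayerItem(item);
const playerLayer = AVPlayerLayer.playerLayerWithPlayer(player);
// Add playerLayer as a sublayer of the hosting UIView's layer and size it to that view's bounds,
// then start playback.
player.play();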
Original question: javascript - Converting an AVVideoComposition initializer to NativeScript - https://stackoverflow.com/questions/58082795/