ios - GPUImageMovie use multiple images as textures and process

Tags: ios objective-c avplayer gpuimage

I am trying to use several GPUImagePicture instances as texture sources, together with a fragment shader, to filter a playing video.

I can process still images this way just fine, but I can't figure out what I'm missing to make it work with GPUImageMovie. I would appreciate any help.

@property (nonatomic, strong) GPUImageView *gpuPlayerView;
@property (nonatomic, strong) GPUImageMovie *gpuMovie;

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.video];
self.player = [AVPlayer playerWithPlayerItem:playerItem];
self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
[self.player play];

self.gpuMovie = [[GPUImageMovie alloc] initWithPlayerItem:playerItem];
self.gpuMovie.playAtActualSpeed = YES;

GPUImagePicture *sourcePicture1 = [[GPUImagePicture alloc] initWithImage:
    [UIImage imageNamed:@"FilterBG"]];
GPUImagePicture *sourcePicture2 = [[GPUImagePicture alloc] initWithImage:
    [UIImage imageNamed:@"FilterOverlay"]];
GPUImagePicture *sourcePicture3 = [[GPUImagePicture alloc] initWithImage:
    [UIImage imageNamed:@"Filter1Map"]];

GPUImageFilter *filter = [[GPUImageFourInputFilter alloc] initWithFragmentShaderFromString:
    kFilter1ShaderString];

[self.gpuMovie addTarget:filter atTextureLocation:0];

if (sourcePicture1) {
    [sourcePicture1 addTarget:filter atTextureLocation:1];
}
if (sourcePicture2) {
    [sourcePicture2 addTarget:filter atTextureLocation:2];
}
if (sourcePicture3) {
    [sourcePicture3 addTarget:filter atTextureLocation:3];
}

[filter addTarget:self.gpuPlayerView];
[self.gpuMovie startProcessing];
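One detail worth checking first (an assumption based on how GPUImage still-image sources generally behave, not something confirmed in this thread): a GPUImagePicture does not push a frame into the pipeline on its own; each one must be told to process once, or the multi-input filter never receives all four inputs and never renders. A minimal sketch of that fix against the code above:

```objectivec
// Sketch (assumption): force each still-image source to render one frame
// into the four-input filter, so the movie frames have all texture inputs
// available when blending starts.
[sourcePicture1 processImage];
[sourcePicture2 processImage];
[sourcePicture3 processImage];

// Only then start pulling frames from the movie.
[self.gpuMovie startProcessing];
```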

Best answer

There is a way to achieve the same effect using

CVOpenGLESTextureCacheCreateTextureFromImage

which allows a buffer to be shared between GL textures and movies. In Core Video, OpenGL ES texture caches are used to cache and manage CVOpenGLESTextureRef textures. These texture caches give you a way to read and write buffers with various pixel formats (such as 420v or BGRA) directly from GLES.

//Mapping a BGRA buffer as a source texture:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0, &outTexture);
//Mapping a BGRA buffer as a renderbuffer:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_RENDERBUFFER, GL_RGBA8_OES, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0, &outTexture);
//Mapping the luma plane of a 420v buffer as a source texture:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE, width, height, GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &outTexture);
//Mapping the chroma plane of a 420v buffer as a source texture:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, width/2, height/2, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &outTexture);
//Mapping a yuvs buffer as a source texture (note: yuvs/f and 2vuy are unpacked and resampled -- not colorspace converted)
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGB_422_APPLE, width, height, GL_RGB_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, 1, &outTexture);

    CVReturn CVOpenGLESTextureCacheCreateTextureFromImage(
        CFAllocatorRef __nullable allocator,
        CVOpenGLESTextureCacheRef __nonnull textureCache,
        CVImageBufferRef __nonnull sourceImage,
        CFDictionaryRef __nullable textureAttributes,
        GLenum target,
        GLint internalFormat,
        GLsizei width,
        GLsizei height,
        GLenum format,
        GLenum type,
        size_t planeIndex,
        CVOpenGLESTextureRef __nullable * __nonnull textureOut );

This function creates a new, or returns a cached, CVOpenGLESTextureRef texture object mapped to the given CVImageBufferRef and the associated parameters. The operation creates a live binding between the image buffer and the underlying texture object.

I hope this helps you create what you need :)
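For context, a minimal sketch of how this function is typically used. The cache is created once from your GL context, and each incoming frame is then mapped through it; `eaglContext` and `pixelBuffer` are placeholders assumed to come from your own rendering setup and something like AVPlayerItemVideoOutput:

```objectivec
// Create the texture cache once, tied to the GL context.
CVOpenGLESTextureCacheRef textureCache = NULL;
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                            eaglContext, NULL, &textureCache);

// Per frame: map a BGRA CVPixelBufferRef straight to a GL texture,
// with no CPU-side copy.
CVOpenGLESTextureRef texture = NULL;
size_t width  = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   textureCache, pixelBuffer,
                                                   NULL, GL_TEXTURE_2D, GL_RGBA,
                                                   (GLsizei)width, (GLsizei)height,
                                                   GL_BGRA, GL_UNSIGNED_BYTE,
                                                   0, &texture);

if (err == kCVReturnSuccess) {
    glBindTexture(CVOpenGLESTextureGetTarget(texture),
                  CVOpenGLESTextureGetName(texture));
    // ... draw with the texture bound ...
    CFRelease(texture); // release once the GPU is done with this frame
}
// Flush periodically so the cache can recycle unused textures.
CVOpenGLESTextureCacheFlush(textureCache, 0);
```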

Regarding "ios - GPUImageMovie use multiple images as textures and process", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/31149684/
