ios - AVPlayer with AVMutableComposition does not play audio and video

Tags: ios audio video avplayer avmutablecomposition

I am trying to display a video built from a composition of both video and audio. However, I seem to have run into a problem: the player's status never reaches AVPlayerStatusReadyToPlay.

If I include either the video asset or the audio asset directly in the player item, it plays. So I know there is nothing wrong with the assets themselves.
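For reference, the direct path that does work looks roughly like this (a minimal sketch; it assumes the same videoURL and player properties used in the code below):

    // Playing a single asset directly in a player item; this plays fine.
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:videoAsset];
    self.mPlayer = [AVPlayer playerWithPlayerItem:playerItem];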

Here is my code:

    - (void)loadPlayer {
        NSURL *videoURL = **; // URL elided in the original post
        AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];

        NSURL *audioURL = **; // URL elided in the original post
        AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];

        // Load the "duration" key asynchronously on both assets. Whichever
        // completion handler fires second builds the composition.
        NSArray *keys = [NSArray arrayWithObject:@"duration"];
        [videoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{

            NSError *error = nil;
            AVKeyValueStatus durationStatus = [videoAsset statusOfValueForKey:@"duration" error:&error];

            switch (durationStatus) {
                case AVKeyValueStatusLoaded:
                    _videoDuration = videoAsset.duration;
                    // Proceed only once the audio duration has also loaded.
                    if (_audioDuration.flags == kCMTimeFlags_Valid) {
                        [self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
                    }
                    break;
                default:
                    break;
            }
        }];

        [audioAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{

            NSError *error = nil;
            AVKeyValueStatus durationStatus = [audioAsset statusOfValueForKey:@"duration" error:&error];

            switch (durationStatus) {
                case AVKeyValueStatusLoaded:
                    _audioDuration = audioAsset.duration;
                    // Proceed only once the video duration has also loaded.
                    if (_videoDuration.flags == kCMTimeFlags_Valid) {
                        [self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
                    }
                    break;
                default:
                    break;
            }
        }];
    }

    - (void)loadPlayWithVideoAsset:(AVURLAsset *)videoAsset withDuration:(CMTime)videoDuration andAudioAsset:(AVURLAsset *)audioAsset withDuration:(CMTime)audioDuration {

        AVMutableComposition *composition = [AVMutableComposition composition];

        // Video track
        AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
        NSError *videoError = nil;
        if (![compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                                            ofTrack:videoTrack
                                             atTime:kCMTimeZero
                                              error:&videoError]) {
            NSLog(@"videoError: %@", videoError);
        }

        // Audio track
        AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
        NSError *audioError = nil;
        if (![compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                                            ofTrack:audioTrack
                                             atTime:kCMTimeZero
                                              error:&audioError]) {
            NSLog(@"audioError: %@", audioError);
        }

        // Pad the shorter track with an empty time range so both tracks
        // cover the same duration.
        NSInteger compare = CMTimeCompare(videoDuration, audioDuration);
        if (compare == 1) {
            // The video is longer
            CMTime timeDiff = CMTimeSubtract(videoDuration, audioDuration);
            [compositionAudioTrack insertEmptyTimeRange:CMTimeRangeMake(audioDuration, timeDiff)];
        }
        else {
            CMTime timeDiff = CMTimeSubtract(audioDuration, videoDuration);
            [compositionVideoTrack insertEmptyTimeRange:CMTimeRangeMake(videoDuration, timeDiff)];
        }

        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
        self.mPlayer = [AVPlayer playerWithPlayerItem:playerItem];
        self.mPlaybackView = [[AVPlayerPlaybackView alloc] initWithFrame:CGRectZero];
        [self.view addSubview:self.mPlaybackView];
        // Observe the player's status so we know when it is ready to play.
        [self.mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerPlaybackViewControllerStatusObservationContext];
    }
    - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
    {
        if (self.mPlayer.status == AVPlayerStatusReadyToPlay) {
            [self.mPlaybackView setPlayer:self.mPlayer];
            isReadyToPlay = YES;
            _playVideoBtn.hidden = NO;
        }
    }
    - (void)playVideo {
        // NOTE: "YES ||" short-circuits the readiness check, so play is
        // always attempted; presumably a debugging leftover.
        if (YES || isReadyToPlay) {
            [self.mPlayer play];
        }
    }

Best Answer

In my experience, AVPlayer only works with an AVMutableComposition if the assets are bundled with the app. If the video asset lives on the network, AVPlayer will not play the AVMutableComposition, even though the AVPlayerItem and AVPlayer report their status as "ready to play".
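For illustration, here is a sketch of the case that does work: both assets loaded from file URLs inside the app bundle rather than over the network (the resource names here are hypothetical):

    // Hypothetical bundled resources; the key point is that both URLs are
    // local file URLs into the app bundle, not network URLs.
    NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"movie" withExtension:@"mp4"];
    NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"soundtrack" withExtension:@"m4a"];
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];
    // ...then build and play the AVMutableComposition exactly as in the question.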

Regarding "ios - AVPlayer with AVMutableComposition does not play audio and video", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/20724973/
