ios - I can't save AVFoundation video to a local URL

Tags: ios objective-c avfoundation video-capture avcapturemoviefileoutput

I'm new to programming and Objective-C (about six weeks in), and this is my first time working with AVFoundation. The goal is a stretch for my level, but it shouldn't be too hard for someone familiar with the framework.

My goal is to create a "Snapchat"-style custom camera interface that captures a still image when you tap the button and records video while you hold the button down.

I've been able to piece and mash together most of the code (video preview, still image capture, programmatic buttons, etc.), but I can't successfully save the video locally (I'll be adding it to a Parse-backed project later this week).

ViewController.h (for reference)

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController 

@property UIButton *button;
@property UIButton *saveButton;
@property UIImageView *previewView;

#define VIDEO_FILE @"test.mov"

@end

ViewController.m

The way I've structured the code is to initialize the session in the first group of methods, then split still-image and video capture into their own sections. The input device is AVMediaTypeVideo, and it outputs to AVCaptureStillImageOutput and AVCaptureMovieFileOutput.

    #import "ViewController.h"

@interface ViewController () <AVCaptureFileOutputRecordingDelegate>

@end

@implementation ViewController
AVCaptureSession *session;
AVCaptureStillImageOutput *imageOutput;
AVCaptureMovieFileOutput *movieOutput;
AVCaptureConnection *videoConnection;

- (void)viewDidLoad {
    [super viewDidLoad];
    [self testDevices];
    self.view.backgroundColor = [UIColor blackColor];

    //Image preview
    self.previewView = [[UIImageView alloc]initWithFrame:self.view.frame];
    self.previewView.backgroundColor = [UIColor whiteColor];
    self.previewView.contentMode = UIViewContentModeScaleAspectFill;
    self.previewView.hidden = YES;
    [self.view addSubview:self.previewView];

    //Buttons
    self.button = [self createButtonWithTitle:@"REC" chooseColor:[UIColor redColor]];
    UILongPressGestureRecognizer *longPressRecognizer = [[UILongPressGestureRecognizer alloc]initWithTarget:self action:@selector(handleLongPressGesture:)];
    [self.button addGestureRecognizer:longPressRecognizer];
    [self.button addTarget:self action:@selector(captureImage) forControlEvents:UIControlEventTouchUpInside];

    self.saveButton = [self createSaveButton];
    [self.saveButton addTarget:self action:@selector(saveActions) forControlEvents:UIControlEventTouchUpInside];
}

- (void)viewWillAppear:(BOOL)animated {
    //Tests
    [self initializeAVItems];
    NSLog(@"%@", videoConnection);
    NSLog(@"%@", imageOutput.connections);
    NSLog(@"%@", imageOutput.description.debugDescription);
}

#pragma mark - AV initialization

- (void)initializeAVItems {
    //Start session, input
    session = [[AVCaptureSession alloc]init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    } else {
        NSLog(@"%@", error);
    }

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    //Layer preview
    CALayer *viewLayer = [[self view] layer];
    [viewLayer setMasksToBounds:YES];

    CGRect frame = self.view.frame;
    [previewLayer setFrame:frame];
    [viewLayer insertSublayer:previewLayer atIndex:0];

    //Image Output
    imageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *imageOutputSettings = [[NSDictionary alloc]initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    imageOutput.outputSettings = imageOutputSettings;

    //Video Output
    movieOutput = [[AVCaptureMovieFileOutput alloc] init];


    [session addOutput:movieOutput];
    [session addOutput:imageOutput];
    [session startRunning];
}

- (void)testDevices {
    NSArray *devices = [AVCaptureDevice devices];
    for (AVCaptureDevice *device in devices) {
        NSLog(@"Device name: %@", [device localizedName]);
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device position] == AVCaptureDevicePositionBack) {
                NSLog(@"Device position : back");
            }
            else {
                NSLog(@"Device position : front");
            }
        }
    }
}

#pragma mark - Image capture

- (void)captureImage {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in imageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    NSLog(@"Requesting capture from: %@", imageOutput);
    [imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            self.previewView.image = image;
            self.previewView.hidden = NO;
        }
    }];
    [self saveButtonFlyIn:self.saveButton];
}

#pragma mark - Video capture

- (void)captureVideo {
    NSLog(@"%@", movieOutput.connections);
    [[NSFileManager defaultManager] removeItemAtURL:[self outputURL] error:nil];

    videoConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:movieOutput.connections];

    /* This is where the code is breaking */
    [movieOutput startRecordingToOutputFileURL:[self outputURL] recordingDelegate:self];
}

- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections {
    for (AVCaptureConnection *connection in connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:mediaType]) {
                return connection;
            }
        }
    }
    return nil;
}

#pragma mark - AVCaptureFileOutputRecordingDelegate

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    if (!error) {
        //Do something
    } else {
        NSLog(@"Error: %@", [error localizedDescription]);
    }
}

#pragma mark - Recording Destination URL

- (NSURL *)outputURL {
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *filePath = [documentsDirectory stringByAppendingPathComponent:VIDEO_FILE];
    return [NSURL fileURLWithPath:filePath];
}

#pragma mark - Buttons

- (void)handleLongPressGesture:(UILongPressGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        NSLog(@"Press");
        self.button.backgroundColor = [UIColor greenColor];
        [self captureVideo];
    }
    if (recognizer.state == UIGestureRecognizerStateEnded) {
        NSLog(@"Unpress");
        self.button.backgroundColor = [UIColor redColor];
    }
}

- (UIButton *)createButtonWithTitle:(NSString *)title chooseColor:(UIColor *)color {
    UIButton *button = [[UIButton alloc] initWithFrame:CGRectMake(self.view.frame.size.width - 100, self.view.frame.size.height - 100, 85, 85)];
    button.layer.cornerRadius = button.bounds.size.width / 2;
    button.backgroundColor = color;
    button.tintColor = [UIColor whiteColor];
    [self.view addSubview:button];
    return button;
}

- (UIButton *)createSaveButton {
    UIButton *button = [[UIButton alloc]initWithFrame:CGRectMake(self.view.frame.size.width, 15, 85, 85)];
    button.layer.cornerRadius = button.bounds.size.width / 2;
    button.backgroundColor = [UIColor greenColor];
    button.tintColor = [UIColor whiteColor];
    button.userInteractionEnabled = YES;
    [button setTitle:@"save" forState:UIControlStateNormal];
    [self.view addSubview:button];
    return button;
}

- (void)saveButtonFlyIn:(UIButton *)button {
    CGRect movement = button.frame;
    movement.origin.x = self.view.frame.size.width - 100;

    [UIView animateWithDuration:0.2 animations:^{
        button.frame = movement;
    }];
}

- (void)saveButtonFlyOut:(UIButton *)button {
    CGRect movement = button.frame;
    movement.origin.x = self.view.frame.size.width;

    [UIView animateWithDuration:0.2 animations:^{
        button.frame = movement;
    }];
}

#pragma mark - Save actions

- (void)saveActions {
    [self saveButtonFlyOut:self.saveButton];
    self.previewView.image = nil;
    self.previewView.hidden = YES;
}

@end

The code breaks on this line:

[movieOutput startRecordingToOutputFileURL:[self outputURL] recordingDelegate:self];

Off the top of my head, I figure it could be one of three things:

  1. Does the data exist (it logs, but I can't verify it)?
  2. Am I initializing the destination URL correctly? (A quick check is sketched below.)
  3. Is the data even compatible with the destination?
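
As a starting point on points 1 and 2, here is a minimal sketch of the checks I have in mind (the -verifyOutputDestination helper name is mine, and the logging is purely illustrative). The optional delegate callback only fires if the session actually starts writing, and the NSFileManager queries confirm the destination directory is writable:

//Optional delegate callback: fires only if recording actually begins
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
    NSLog(@"Recording started, writing to: %@", fileURL);
}

//Quick check that the destination URL points at a writable location
- (void)verifyOutputDestination {
    NSURL *url = [self outputURL];
    NSString *directory = [[url URLByDeletingLastPathComponent] path];
    NSLog(@"Output URL: %@", url);
    NSLog(@"Directory exists: %d, writable: %d",
          [[NSFileManager defaultManager] fileExistsAtPath:directory],
          [[NSFileManager defaultManager] isWritableFileAtPath:directory]);
}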

I'd appreciate your perspective/fresh eyes/thoughts on how to check, test, or debug this.

Cheers, J

Best answer

The problem is in your implementation of -initializeAVItems:

- (void)initializeAVItems {
    //Start session, input
    session = [[AVCaptureSession alloc]init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];
    ...
}

If you want to record video with AVCaptureMovieFileOutput, you can't set the AVCaptureSession's sessionPreset to AVCaptureSessionPresetPhoto; that preset is for still images only. For high-quality video output, I recommend AVCaptureSessionPresetHigh.

It's also best to call canSetSessionPreset: before you actually set it:

session = [AVCaptureSession new];
if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
    session.sessionPreset = AVCaptureSessionPresetHigh;
}
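
For reference, here is a minimal sketch of how -initializeAVItems could look with that fix applied. The canAddOutput: guards are an addition for safety rather than part of the fix, and the preview-layer setup from the question is unchanged, so it is elided:

- (void)initializeAVItems {
    //Start session; AVCaptureSessionPresetPhoto blocks movie file output, so use a video-capable preset
    session = [AVCaptureSession new];
    if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    } else {
        NSLog(@"%@", error);
    }

    //... preview layer setup as in the question ...

    //Image output
    imageOutput = [[AVCaptureStillImageOutput alloc] init];
    imageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};

    //Video output
    movieOutput = [[AVCaptureMovieFileOutput alloc] init];

    //Guard each addOutput: so a failure is visible instead of silent
    if ([session canAddOutput:movieOutput]) {
        [session addOutput:movieOutput];
    } else {
        NSLog(@"Could not add movie output");
    }
    if ([session canAddOutput:imageOutput]) {
        [session addOutput:imageOutput];
    } else {
        NSLog(@"Could not add image output");
    }
    [session startRunning];
}

With the photo preset replaced, startRecordingToOutputFileURL:recordingDelegate: should begin writing to the URL returned by -outputURL, and any remaining failure will surface through the delegate's didFinishRecordingToOutputFileAtURL: callback.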

The original question, "ios - I can't save AVFoundation video to a local URL", can be found on Stack Overflow: https://stackoverflow.com/questions/30971132/
