ios - How to crop a detected face

Tags: ios crop face-detection core-image

I am using Core Image to detect faces, and once a face is detected I want to crop it out of the image. I detect faces with the following snippet:

// Relative sizes for the eye/mouth overlays (defined elsewhere in the original post)
#define EYE_SIZE_RATE 0.3f
#define MOUTH_SIZE_RATE 0.4f

-(void)markFaces:(UIImageView *)facePicture{


CIImage* image = [CIImage imageWithCGImage:imageView.image.CGImage];

CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];


NSArray* features = [detector featuresInImage:image];


CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform, 0, -imageView.bounds.size.height);


for(CIFaceFeature* faceFeature in features)
{
    // Get the face rect: Translate CoreImage coordinates to UIKit coordinates
    const CGRect faceRect = CGRectApplyAffineTransform(faceFeature.bounds, transform);


    UIView *faceView = [[UIView alloc] initWithFrame:faceRect];
    faceView.layer.borderWidth = 1;
    faceView.layer.borderColor = [[UIColor redColor] CGColor];


    UIGraphicsBeginImageContext(faceView.bounds.size);
    [faceView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    //Blur the UIImage with a CIFilter
    CIImage *imageToBlur = [CIImage imageWithCGImage:viewImage.CGImage];
    CIFilter *gaussianBlurFilter = [CIFilter filterWithName: @"CIGaussianBlur"];
    [gaussianBlurFilter setValue:imageToBlur forKey: @"inputImage"];
    [gaussianBlurFilter setValue:[NSNumber numberWithFloat: 10] forKey: @"inputRadius"];
    CIImage *resultImage = [gaussianBlurFilter valueForKey: @"outputImage"];
    UIImage *endImage = [[UIImage alloc] initWithCIImage:resultImage];

    //Place the UIImage in a UIImageView
    UIImageView *newView = [[UIImageView alloc] initWithFrame:self.view.bounds];
    newView.image = endImage;
    [self.view addSubview:newView];

    CGFloat faceWidth = faceFeature.bounds.size.width;

    [imageView addSubview:faceView];

    // LEFT EYE
    if(faceFeature.hasLeftEyePosition)
    {

        const CGPoint leftEyePos = CGPointApplyAffineTransform(faceFeature.leftEyePosition, transform);

        UIView *leftEyeView = [[UIView alloc] initWithFrame:CGRectMake(leftEyePos.x - faceWidth*EYE_SIZE_RATE*0.5f,
                                                                       leftEyePos.y - faceWidth*EYE_SIZE_RATE*0.5f
                                                                       ,faceWidth*EYE_SIZE_RATE,
                                                                       faceWidth*EYE_SIZE_RATE)];

        NSLog(@"Left Eye X = %0.1f Y = %0.1f Width = %0.1f Height = %0.1f",leftEyePos.x - faceWidth*EYE_SIZE_RATE*0.5f,
              leftEyePos.y - faceWidth*EYE_SIZE_RATE*0.5f,faceWidth*EYE_SIZE_RATE,
              faceWidth*EYE_SIZE_RATE);

        leftEyeView.backgroundColor = [[UIColor magentaColor] colorWithAlphaComponent:0.3];
        leftEyeView.layer.cornerRadius = faceWidth*EYE_SIZE_RATE*0.5;


        [imageView addSubview:leftEyeView];
    }


    // RIGHT EYE
    if(faceFeature.hasRightEyePosition)
    {

        const CGPoint rightEyePos = CGPointApplyAffineTransform(faceFeature.rightEyePosition, transform);


        UIView *rightEye = [[UIView alloc] initWithFrame:CGRectMake(rightEyePos.x - faceWidth*EYE_SIZE_RATE*0.5,
                                                                    rightEyePos.y - faceWidth*EYE_SIZE_RATE*0.5,
                                                                    faceWidth*EYE_SIZE_RATE,
                                                                    faceWidth*EYE_SIZE_RATE)];



        NSLog(@"Right Eye X = %0.1f Y = %0.1f Width = %0.1f Height = %0.1f",rightEyePos.x - faceWidth*EYE_SIZE_RATE*0.5f,
              rightEyePos.y - faceWidth*EYE_SIZE_RATE*0.5f,faceWidth*EYE_SIZE_RATE,
              faceWidth*EYE_SIZE_RATE);

        rightEye.backgroundColor = [[UIColor blueColor] colorWithAlphaComponent:0.2];
        rightEye.layer.cornerRadius = faceWidth*EYE_SIZE_RATE*0.5;
        [imageView addSubview:rightEye];
    }


    // MOUTH
    if(faceFeature.hasMouthPosition)
    {

        const CGPoint mouthPos = CGPointApplyAffineTransform(faceFeature.mouthPosition, transform);


        UIView* mouth = [[UIView alloc] initWithFrame:CGRectMake(mouthPos.x - faceWidth*MOUTH_SIZE_RATE*0.5,
                                                                 mouthPos.y - faceWidth*MOUTH_SIZE_RATE*0.5,
                                                                 faceWidth*MOUTH_SIZE_RATE,
                                                                 faceWidth*MOUTH_SIZE_RATE)];

        NSLog(@"Mouth X = %0.1f Y = %0.1f Width = %0.1f Height = %0.1f",mouthPos.x - faceWidth*MOUTH_SIZE_RATE*0.5f,
              mouthPos.y - faceWidth*MOUTH_SIZE_RATE*0.5f,faceWidth*MOUTH_SIZE_RATE,
              faceWidth*MOUTH_SIZE_RATE);


        mouth.backgroundColor = [[UIColor greenColor] colorWithAlphaComponent:0.3];
        mouth.layer.cornerRadius = faceWidth*MOUTH_SIZE_RATE*0.5;
        [imageView addSubview:mouth];

    }
}
}

All I want is just the face.

Best Answer

You can easily crop the face with the following function. It has been tested and works correctly.

-(void)faceWithFrame:(CGRect)frame{
    CGRect rect = frame;
    CGImageRef imageRef = CGImageCreateWithImageInRect([self.imageView.image CGImage], rect);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    self.cropedImg.image = croppedImage;
    CGImageRelease(imageRef); // CGImageCreateWithImageInRect returns a +1 reference; release it to avoid a leak
}

Just pass in the face's frame, and the function above will return the cropped face image.
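One caveat when wiring the two pieces together: `faceFeature.bounds` comes back in Core Image coordinates (origin at the bottom-left of the image), while `CGImageCreateWithImageInRect` expects top-left image coordinates, so the rect has to be flipped vertically before cropping. A minimal sketch of combining detection and cropping under that assumption (the method name `croppedFacesInImage:` and the use of ARC are illustrative, not from the original post):

```objc
// Sketch: crop every detected face out of a UIImage (assumes ARC).
- (NSArray *)croppedFacesInImage:(UIImage *)sourceImage {
    CIImage *ciImage = [CIImage imageWithCGImage:sourceImage.CGImage];
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
    NSMutableArray *faces = [NSMutableArray array];
    CGFloat imageHeight = CGImageGetHeight(sourceImage.CGImage);
    for (CIFaceFeature *feature in [detector featuresInImage:ciImage]) {
        // Flip from Core Image (bottom-left origin) to CGImage (top-left origin) coordinates.
        CGRect faceRect = feature.bounds;
        faceRect.origin.y = imageHeight - faceRect.origin.y - faceRect.size.height;
        CGImageRef faceRef = CGImageCreateWithImageInRect(sourceImage.CGImage, faceRect);
        if (faceRef) {
            [faces addObject:[UIImage imageWithCGImage:faceRef]];
            CGImageRelease(faceRef); // CGImageCreateWithImageInRect returns a +1 reference
        }
    }
    return faces;
}
```

Note that this works in pixel coordinates of the underlying `CGImage`; if the `UIImage` has a scale other than 1.0 or a non-default orientation, the rect would need further adjustment.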

Regarding ios - How to crop a detected face, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/23097468/
