I created a function that resizes an image to a maximum width and height while maintaining its aspect ratio, and also compresses it according to a compression quality. I tested it with an 11.7 MB, 3024x4032 JPEG image and these parameters:

maxWidth = 800px
maxHeight = 1200px
compressionQuality = 0.5

The function does shrink the file from 11.7 MB to 0.51 MB, but the width and height are not reduced correctly. After uploading to Firebase, the image measures 1600x2134 px, double what it should be (800x1066 px).

Can you spot what's going wrong?
import UIKit
import Foundation

class ImageEdit {

    static let instance = ImageEdit()

    func resizeAndCompressImageWith(image: UIImage, maxWidth: CGFloat, maxHeight: CGFloat, compressionQuality: CGFloat) -> Data? {
        let horizontalRatio = maxWidth / image.size.width
        let verticalRatio = maxHeight / image.size.height
        let ratio = min(horizontalRatio, verticalRatio)
        let newSize = CGSize(width: image.size.width * ratio, height: image.size.height * ratio)
        var newImage: UIImage

        if #available(iOS 10.0, *) {
            let renderFormat = UIGraphicsImageRendererFormat.default()
            renderFormat.opaque = false
            let renderer = UIGraphicsImageRenderer(size: CGSize(width: newSize.width, height: newSize.height), format: renderFormat)
            newImage = renderer.image { (context) in
                image.draw(in: CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
            }
        } else {
            UIGraphicsBeginImageContextWithOptions(CGSize(width: newSize.width, height: newSize.height), true, 0)
            image.draw(in: CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
            newImage = UIGraphicsGetImageFromCurrentImageContext()!
            UIGraphicsEndImageContext()
        }

        let data = UIImageJPEGRepresentation(newImage, compressionQuality)
        return data
    }
}
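As a quick sanity check of the aspect-fit math above (a standalone sketch, not part of the original post — aspectFitSize is a name introduced here for illustration), the ratio calculation applied to the 3024x4032 test image does produce the expected 800x1066 target:

```swift
import CoreGraphics

// Standalone re-implementation of the ratio math from resizeAndCompressImageWith.
func aspectFitSize(for size: CGSize, maxWidth: CGFloat, maxHeight: CGFloat) -> CGSize {
    // min() picks the more restrictive ratio so both limits are respected.
    let ratio = min(maxWidth / size.width, maxHeight / size.height)
    return CGSize(width: size.width * ratio, height: size.height * ratio)
}

let fitted = aspectFitSize(for: CGSize(width: 3024, height: 4032), maxWidth: 800, maxHeight: 1200)
print(fitted) // roughly (800.0, 1066.7) -- the size the questioner expects
```

So the ratio calculation itself is correct; the doubling has to be happening later in the pipeline.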
Here is the code that uploads the image to Firebase.
func uploadImageToFirebaseAndReturnImageURL(directory: String, image: UIImage!, maxWidth: CGFloat, maxHeight: CGFloat, compressionQuality: CGFloat, handler: @escaping (_ imageURL: String) -> ()) {
    let imageName = NSUUID().uuidString // create unique image name
    if let uploadData = ImageEdit.instance.resizeAndCompressImageWith(image: image, maxWidth: maxWidth, maxHeight: maxHeight, compressionQuality: compressionQuality) {
        DB_STORE.child(directory).child(imageName).putData(uploadData, metadata: nil, completion: { (metadata, error) in
            if error != nil {
                print(error ?? "Image upload failed for unknown reason")
                return
            }
            // if the URL exists, return imageURL
            if let imageURL = metadata?.downloadURL()?.absoluteString {
                handler(imageURL)
            }
            return
        })
    }
}
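A side note on the upload code, separate from the sizing issue: in newer Firebase Storage SDKs (5.x and later) StorageMetadata.downloadURL() was removed, and the download URL is instead requested asynchronously from the StorageReference. A sketch of the completion handler under that newer API (assuming DB_STORE is the same StorageReference used in the question) might look like:

```swift
// Sketch for Firebase Storage SDK 5+, where metadata.downloadURL() no longer exists.
// DB_STORE, directory, imageName, uploadData, and handler are as in the question.
let imageRef = DB_STORE.child(directory).child(imageName)
imageRef.putData(uploadData, metadata: nil) { metadata, error in
    if let error = error {
        print(error)
        return
    }
    // Ask the reference itself for the download URL instead of the metadata.
    imageRef.downloadURL { url, error in
        if let imageURL = url?.absoluteString {
            handler(imageURL)
        }
    }
}
```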
Best Answer
I copied your code into a test project, added some print statements, and tried resizing an image whose original dimensions are 3360x2108. (Note: I use force unwrapping in this test code, which is not recommended for any production code.)

Here is the function I use to call the resizing code:
func resizeImage() {
    guard let image = UIImage.init(named: "landscape") else {
        return
    }
    print("Original Image Size: width: \(image.size.width) height: \(image.size.height)")
    let _ = ImageEdit.instance.resizeAndCompressImageWith(image: image, maxWidth: 800.0, maxHeight: 1200.0, compressionQuality: 0.5)
}
Here is my updated version of your resizing code. I just added some code at the end to instantiate a few image instances and log their actual sizes after conversion:
import UIKit

class ImageEdit {

    static let instance = ImageEdit()

    func resizeAndCompressImageWith(image: UIImage, maxWidth: CGFloat, maxHeight: CGFloat, compressionQuality: CGFloat) -> Data? {
        let horizontalRatio = maxWidth / image.size.width
        let verticalRatio = maxHeight / image.size.height
        let ratio = min(horizontalRatio, verticalRatio)
        print("Image Ratio: \(ratio)")
        let newSize = CGSize(width: image.size.width * ratio, height: image.size.height * ratio)
        print("NewSize: \(String(describing: newSize))")
        var newImage: UIImage

        if #available(iOS 10.0, *) {
            print("UIGraphicsImageRendererFormat")
            let renderFormat = UIGraphicsImageRendererFormat.default()
            renderFormat.opaque = false
            let renderer = UIGraphicsImageRenderer(size: CGSize(width: newSize.width, height: newSize.height), format: renderFormat)
            newImage = renderer.image { (context) in
                image.draw(in: CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
            }
        } else {
            print("UIGraphicsBeginImageContextWithOptions")
            UIGraphicsBeginImageContextWithOptions(CGSize(width: newSize.width, height: newSize.height), true, 0)
            image.draw(in: CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
            newImage = UIGraphicsGetImageFromCurrentImageContext()!
            UIGraphicsEndImageContext()
        }

        print("NewImageSize: width: \(newImage.size.width) height: \(newImage.size.height)")

        let png = UIImagePNGRepresentation(newImage)
        let pngImg = UIImage.init(data: png!)!
        print("PNG - width: \(pngImg.size.width) - height: \(pngImg.size.height)")

        let data = UIImageJPEGRepresentation(newImage, compressionQuality)
        let jpgImg = UIImage.init(data: data!)!
        print("JPG - width: \(jpgImg.size.width) - height: \(jpgImg.size.height)")

        let fullData = UIImageJPEGRepresentation(newImage, 1.0)
        let jpgFull = UIImage.init(data: fullData!)!
        print("JPG FULL - width: \(jpgFull.size.width) - height: \(jpgFull.size.height)")

        return data
    }
}
Running this on an iOS 11 simulator, I get the following in the debugger:
Original Image Size: width: 3360.0 height: 2108.0
Image Ratio: 0.238095238095238
NewSize: (800.0, 501.904761904762)
UIGraphicsImageRendererFormat
NewImageSize: width: 800.0 height: 502.0
PNG - width: 2400.0 - height: 1506.0
JPG - width: 2400.0 - height: 1506.0
JPG FULL - width: 2400.0 - height: 1506.0
If I comment out your if #available(iOS 10.0, *) branch, I still see the same measurements.
It appears that the newImage generated directly by either UIGraphicsImageRenderer or UIGraphicsBeginImageContextWithOptions has the size you specified. However, for some reason, running that image through UIImagePNGRepresentation(newImage) or UIImageJPEGRepresentation(newImage, compressionQuality) produces an image 3x larger than the original. Even changing

UIGraphicsBeginImageContextWithOptions(size, false, 0.0)

to

UIGraphicsBeginImageContextWithOptions(size, false, UIScreen.main.scale)

doesn't seem to matter.

In my test case, UIScreen.main.scale = 3.0.

It appears that converting the image through the PNG or JPEG representation methods multiplies the final image size produced by those functions by the value of UIScreen.main.scale.
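The diagnosis above follows from UIImage.size being measured in points, while UIGraphicsImageRendererFormat defaults to the screen's scale: an 800-point render at 3x is backed by a 2400-pixel bitmap, and encoding to PNG/JPEG bakes in those pixel dimensions. A plausible fix (my sketch, not part of the accepted answer) is to pin the scale to 1 in both branches of the original function, so points and pixels coincide:

```swift
// Sketch of a fix: render at scale 1 so the encoded bitmap matches newSize exactly.
if #available(iOS 10.0, *) {
    let renderFormat = UIGraphicsImageRendererFormat.default()
    renderFormat.opaque = false
    renderFormat.scale = 1.0 // default is the screen scale (3.0 in the test above)
    let renderer = UIGraphicsImageRenderer(size: newSize, format: renderFormat)
    newImage = renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: newSize))
    }
} else {
    // Passing 1.0 (not 0, which means "use the screen scale") keeps the
    // bitmap at newSize pixels.
    UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
    image.draw(in: CGRect(origin: .zero, size: newSize))
    newImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
}
```

With the scale pinned to 1, the data produced by UIImageJPEGRepresentation should come out at the requested 800-pixel width rather than 800 x scale.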
Regarding "swift - Resize and compress an image before uploading to Google Cloud Storage", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48288430/