I have the following function:
async def _s3_copy_object(self, s3_source, s3_destination):
    source_bucket, source_key = get_s3_bucket_and_key(s3_source)
    destination_bucket, destination_key = get_s3_bucket_and_key(s3_destination)
    print("copying object: {} to {}".format(s3_source, s3_destination))
    source = {'Bucket': source_bucket, 'Key': source_key}
    await self._async_s3.copy_object(CopySource=source,
                                     Bucket=destination_bucket,
                                     Key=destination_key,
                                     ServerSideEncryption='AES256',
                                     MetadataDirective='COPY',
                                     TaggingDirective='COPY')
This works fine when the file is smaller than 5 GB, but it fails when the object exceeds 5 GB.
I get the following error:
An error occurred (InvalidRequest) when calling the CopyObject operation: The specified copy source is larger than the maximum allowable size for a copy source: 5368709120: 1313
Is there a workaround?
Best Answer
You need to use the boto3 copy method instead of copy_object. It performs the multipart upload that is required when copying objects larger than 5 GB, and it also handles threading for you.
Regarding python - Boto3 Copy_Object fails for size > 5GB, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/52879356/