I'm trying to use AWS Machine Learning batch processing in a Python project, using boto3. I'm getting this failure message in the response:
There was an error trying to parse the schema: 'Can not deserialize instance of boolean out of START_ARRAY token at [Source: java.io.StringReader@60618eb4; line: 1, column: 2] (through reference chain: com.amazon.eml.dp.recordset.SchemaPojo["dataFileContainsHeader"])'
The .csv file I'm using is valid; I know this because the same file works through the console process.
Here is my code; it's a method on a Django model that stores the URL of the file to be processed (input_file):
def create_data_source_from_s3(self):
    attributes = []
    attribute = {"fieldName": "Var1", "fieldType": "CATEGORICAL"}
    attributes.append(attribute)
    attribute = {"fieldName": "Var2", "fieldType": "CATEGORICAL"}
    attributes.append(attribute)
    attribute = {"fieldName": "Var3", "fieldType": "NUMERIC"}
    attributes.append(attribute)
    attribute = {"fieldName": "Var4", "fieldType": "CATEGORICAL"}
    attributes.append(attribute)
    attribute = {"fieldName": "Var5", "fieldType": "CATEGORICAL"}
    attributes.append(attribute)
    attribute = {"fieldName": "Var6", "fieldType": "CATEGORICAL"}
    attributes.append(attribute)

    dataSchema = {}
    dataSchema['version'] = '1.0'
    dataSchema['dataFormat'] = 'CSV'
    dataSchema['attributes'] = attributes
    dataSchema["targetFieldName"] = "Var6"
    dataSchema["dataFileContainsHeader"] = True,
    json_data = json.dumps(dataSchema)

    client = boto3.client(
        'machinelearning',
        region_name=settings.region,
        aws_access_key_id=settings.aws_access_key_id,
        aws_secret_access_key=settings.aws_secret_access_key
    )
    # create a datasource
    return client.create_data_source_from_s3(
        DataSourceId=self.input_file.name,
        DataSourceName=self.input_file.name,
        DataSpec={
            'DataLocationS3': 's3://' + settings.AWS_S3_BUCKET_NAME + '/' + self.input_file.name,
            'DataSchema': json_data,
        },
        ComputeStatistics=True
    )
Any idea what I'm doing wrong?
Best answer
Remove the trailing comma from this line:
dataSchema["dataFileContainsHeader"] = True,
The comma makes Python treat the right-hand side as a one-element tuple, so dataSchema actually contains (True,), and json.dumps serializes that tuple as a JSON array.
Your serialized output looks like this:
{"dataFileContainsHeader": [true], "attributes": [{"fieldName": "Var1", "fieldType": "CATEGORICAL"}, {"fieldName": "Var2", "fieldType": "CATEGORICAL"}, {"fieldName": "Var3", "fieldType": "NUMERIC"}, {"fieldName": "Var4", "fieldType": "CATEGORICAL"}, {"fieldName": "Var5", "fieldType": "CATEGORICAL"}, {"fieldName": "Var6", "fieldType": "CATEGORICAL"}], "version": "1.0", "dataFormat": "CSV", "targetFieldName": "Var6"}
AWS is expecting a plain boolean instead:
"dataFileContainsHeader": true
Regarding calling boto3.client.create_data_source_from_s3 from Python, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/42506639/