I need to convert some date formats and, based on some business logic, process them into boolean conditions, but I run into a problem when calling a Python script from Hive. Below is the script I wrote to convert the date format of one sample column:
import sys
def getYearMonthFromStringDate(dt):
    year = 0
    month = 0
    try:
        ss = dt.split('-')
        year = ss[0]
        month = ss[1]
    except ValueError:
        print "Error parsing date string %s" % dt
    return int(year) * 100 + int(month)

for line in sys.stdin:
    tempArr = line.split('\t')
    accountgl0s = tempArr[0]
    agl0 = getYearMonthFromStringDate(accountgl0s)
    output_list = [accountgl0s, ag10]
    print '\t'.join(output_list)
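One detail worth noting about TRANSFORM scripts in general: everything written to stdout must already be a string. `'\t'.join()` raises a `TypeError` when handed an int, and any uncaught exception in the streaming script surfaces on the Hive side as the generic "error while closing operators". A minimal Python 3 sketch of that behavior (hypothetical values, not the script above):

```python
# str.join() accepts only strings; an int in the list raises TypeError.
row = '2016-10-01'
agl0 = 201610  # the int result of year*100 + month

try:
    out = '\t'.join([row, agl0])   # fails: join expected str, got int
except TypeError as exc:
    print('join failed:', exc)

# Casting the int to str first produces a well-formed output line.
out = '\t'.join([row, str(agl0)])
print(out)  # → 2016-10-01	201610
```

Inside Hive there is no traceback to read, which is why a crash like this only shows up as the opaque error 20003.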
I add the file to the distributed cache with:
add file /folder/date.py
Now I invoke this Python script with TRANSFORM on the Hive table column accountgl0s, like so (input column accountgl0s = '2016-10-01'):
select transform(accountgl0) using 'python date.py' as (accountgl0s,agl0) from sample;
My expected output is
2016-10-01	201610
but instead I get the following error:
Error: java.lang.RuntimeException: Hive Runtime Error while closing operators
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:217)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: [Error 20003]: An error occurred when trying to close the Operator running your custom script.
at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:557)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:610)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:199)
... 8 more
FAILED: Execution Error, return code 20003 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. An error occurred when trying to close the Operator running your custom script.
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
Best answer
When you want to do arithmetic with the value, you have to change the variable's type to float:
f_accountgl0s = float(accountgl0s)
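Independent of the float hint, the script as posted has two concrete defects the error can be traced to: the undefined name `ag10` (a typo for `agl0`), and the fact that `'\t'.join()` requires string elements. A corrected sketch in Python 3 syntax, under those assumptions (not the answerer's exact code, and with renamed helpers for readability):

```python
import sys

def get_year_month(dt):
    """Turn a 'YYYY-MM-DD' string into the int YYYYMM (e.g. 201610)."""
    try:
        year, month = dt.split('-')[:2]
        return int(year) * 100 + int(month)
    except (ValueError, IndexError):
        # Diagnostics go to stderr so they don't pollute the
        # tab-separated rows Hive reads back from stdout.
        sys.stderr.write('Error parsing date string %s\n' % dt)
        return 0

if __name__ == '__main__':
    for line in sys.stdin:
        accountgl0s = line.strip().split('\t')[0]
        agl0 = get_year_month(accountgl0s)   # note: agl0, not 'ag10'
        # join() needs strings, so cast the computed int first.
        print('\t'.join([accountgl0s, str(agl0)]))
```

With either fix missing, the script dies on the first row and Hive reports only the generic return code 20003 shown above.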
Regarding python - Hive: python UDF giving "Hive Runtime Error while closing operators", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/40926276/