hadoop - Hadoop streaming failing in 1.0.0 with a simple map and reduce job (using nltk code)

Tags: hadoop

The command I run and its output:

[hduser@Janardhan hadoop]$ bin/hadoop jar contrib/streaming/hadoop-streaming-1.0.0.jar -file /home/hduser/mapper.py -mapper mapper.py -file /home/hduser/reducer.py -reducer reducer.py -input /user/hduser/input.txt -output /home/hduser/outpututttt


 Warning: $HADOOP_HOME is deprecated.

    packageJobJar: [/home/hduser/mapper.py, /home/hduser/reducer.py, /app/hadoop/tmp/hadoop-unjar2185859252991058106/] [] /tmp/streamjob2973484922110272968.jar tmpDir=null
    12/05/03 20:36:02 INFO mapred.FileInputFormat: Total input paths to process : 1
    12/05/03 20:36:03 INFO streaming.StreamJob: getLocalDirs(): [/app/hadoop/tmp/mapred/local]
    12/05/03 20:36:03 INFO streaming.StreamJob: Running job: job_201205032014_0003
    12/05/03 20:36:03 INFO streaming.StreamJob: To kill this job, run:
    12/05/03 20:36:03 INFO streaming.StreamJob: /usr/local/hadoop/libexec/../bin/hadoop job  -Dmapred.job.tracker=localhost:54311 -kill job_201205032014_0003
    12/05/03 20:36:03 INFO streaming.StreamJob: Tracking URL: http://localhost.localdomain:50030/jobdetails.jsp?jobid=job_201205032014_0003
    12/05/03 20:36:04 INFO streaming.StreamJob:  map 0%  reduce 0%
    12/05/03 20:36:21 INFO streaming.StreamJob:  map 100%  reduce 0%
    12/05/03 20:36:24 INFO streaming.StreamJob:  map 0%  reduce 0%
    12/05/03 20:37:00 INFO streaming.StreamJob:  map 100%  reduce 100%
    12/05/03 20:37:00 INFO streaming.StreamJob: To kill this job, run:
    12/05/03 20:37:00 INFO streaming.StreamJob: /usr/local/hadoop/libexec/../bin/hadoop job  -Dmapred.job.tracker=localhost:54311 -kill job_201205032014_0003
    12/05/03 20:37:00 INFO streaming.StreamJob: Tracking URL: http://localhost.localdomain:50030/jobdetails.jsp?jobid=job_201205032014_0003
    12/05/03 20:37:00 ERROR streaming.StreamJob: Job not successful. Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201205032014_0003_m_000000
    12/05/03 20:37:00 INFO streaming.StreamJob: killJob...
    Streaming Job Failed! 

This is the error I get from the jobtracker:
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:311)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:545)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:132)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:36)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1083)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)

It works locally with this code:
    [hduser@Janardhan ~]$ cat input.txt | ./mapper.py | sort | ./reducer.py
    ('be', 'VB')    1
    ('ceremony', 'NN')  1
    ('first', 'JJ')     2
    ('for', 'IN')   2
    ('hi', 'NN')    1
    ('place', 'NN')     1
    ('the', 'DT')   2
    ('welcome', 'VBD')  1
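
(The scripts themselves are not included above. For reference, a minimal mapper.py/reducer.py pair that produces output of this shape, assuming NLTK's pos_tag under Python 2, might look roughly like this sketch:)

    #!/usr/bin/python
    # mapper.py -- hypothetical sketch: POS-tag each input line and emit ((word, tag), 1)
    import sys
    import nltk  # nltk and its tagger data must be installed on every task node

    for line in sys.stdin:
        tokens = nltk.word_tokenize(line.strip())
        for tagged in nltk.pos_tag(tokens):
            # str() of the (word, tag) tuple gives keys like ('be', 'VB'), as shown above
            print '%s\t%d' % (tagged, 1)

with a matching reducer along these lines:

    #!/usr/bin/python
    # reducer.py -- hypothetical sketch: sum the counts for each (word, tag) key
    import sys

    current_key, current_count = None, 0
    for line in sys.stdin:
        key, count = line.rstrip('\n').rsplit('\t', 1)
        if key == current_key:
            current_count += int(count)
        else:
            if current_key is not None:
                print '%s\t%d' % (current_key, current_count)
            current_key, current_count = key, int(count)
    if current_key is not None:
        print '%s\t%d' % (current_key, current_count)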

Best Answer

You'll need to debug this by inspecting the stderr logs of the failed map and reduce tasks on the datanodes where they ran. When a job that works locally fails on the cluster, these logs are usually very revealing.

You should be able to reach the logs through your Hadoop cluster's jobtracker web interface (typically at http://master.node.ip.address:50030/jobtracker.jsp). Your job should be listed under "Failed Jobs"; click the job ID, then click the map or reduce tasks in the "Failed" column, and you should see their logs.
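
If the web interface is not handy, the same per-attempt logs also live on the local filesystem of the node that ran the task. A rough sketch, assuming the default log directory for this install (/usr/local/hadoop/logs); the exact userlogs layout differs between releases, so find is used rather than a fixed path:

    # run on the tasktracker node that executed the failed attempt
    find /usr/local/hadoop/logs/userlogs -path '*_0003_m_000000*' -name stderr -exec cat {} \;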

Note that if mapper.py and reducer.py are not executable (a #!/usr/bin/python first line and the file permissions set correctly), you may need to change the arguments to -mapper 'python mapper.py', and likewise for the reducer.
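
If that is the issue here, either of the following would apply (a sketch only, reusing the paths from the original command):

    # make the scripts executable (and keep the #!/usr/bin/python first line) ...
    chmod +x /home/hduser/mapper.py /home/hduser/reducer.py

    # ... or invoke the interpreter explicitly in the streaming arguments:
    bin/hadoop jar contrib/streaming/hadoop-streaming-1.0.0.jar \
        -file /home/hduser/mapper.py  -mapper 'python mapper.py' \
        -file /home/hduser/reducer.py -reducer 'python reducer.py' \
        -input /user/hduser/input.txt -output /home/hduser/outpututttt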

Regarding hadoop - Hadoop streaming failing in 1.0.0 with a simple map and reduce job (using nltk code), a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/10434346/
