shell - Scheduling/running a mahout command in oozie

Tags: shell hadoop oozie oozie-coordinator

I am trying to run the mahout command seq2sparse with the oozie scheduler, but it throws some errors.
I tried to run the mahout command with the oozie <shell> tag, but nothing worked.

Here is the oozie workflow -

 <action name="mahoutSeq2Sparse">
      <shell xmlns="uri:oozie:shell-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
              <name>mapred.job.queue.name</name>
              <value>${queueName}</value>
            </property>
        </configuration>
        <exec>mahout seq2sparse</exec>
         <argument>-i</argument>
         <argument>${nameNode}/tmp/Clustering/seqOutput</argument>
         <argument>-o</argument>
         <argument>${nameNode}/tmp/Clustering/seqToSparse</argument>
         <argument>-ow</argument>
         <argument>-nv</argument>
         <argument>-x</argument>
         <argument>100</argument>
         <argument>-n</argument>
         <argument>2</argument>
         <argument>-wt</argument>
         <argument>tf</argument>
         <capture-output/>
     </shell>
 <ok to="brandCanopyInitialCluster" />
    <error to="fail" />
</action>
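
One thing to note about this first attempt: the <exec> element of a shell action is normally expected to contain only the name of the executable, so a value like "mahout seq2sparse" (with a space in it) is unlikely to resolve to a runnable command. Below is a sketch of how the same call would usually be split, with seq2sparse passed as the first argument (this is an assumption about the intended fix, not a tested configuration):

         <!-- exec names the binary only; the sub-command becomes the first argument -->
         <exec>mahout</exec>
         <argument>seq2sparse</argument>
         <argument>-i</argument>
         <argument>${nameNode}/tmp/Clustering/seqOutput</argument>
         <argument>-o</argument>
         <argument>${nameNode}/tmp/Clustering/seqToSparse</argument>

with the remaining arguments (-ow, -nv, -x 100, -n 2, -wt tf) kept exactly as above.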

I also tried creating a shell script and running it through oozie:
 <action name="mahoutSeq2Sparse">
       <shell xmlns="uri:oozie:shell-action:0.1">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
              <name>mapred.job.queue.name</name>
              <value>${queueName}</value>
            </property>
        </configuration>
        <exec>${EXEC}</exec>
        <file>${EXEC}#${EXEC}</file>
     </shell>

    <ok to="brandCanopyInitialCluster" />
    <error to="fail" />
</action>

with job.properties as
nameNode=hdfs://abc02:8020
jobTracker=http://abc02:8050/
clusteringJobInput=hdfs://abc02:8020/tmp/Activity/000000_0
queueName=default
oozie.wf.application.path=hdfs://abc02:8020/tmp/workflow/
oozie.use.system.libpath=true
EXEC=generatingBrandSparseFile.sh

and generateBrandSparseFile.sh is
export INPUT_PATH="hdfs://abc02:8020/tmp/Clustering/seqOutput"
export OUTPUT_PATH="hdfs://abc02:8020/tmp/Clustering/seqToSparse"

sudo -u hdfs hadoop fs -chmod -R 777 "hdfs://abc02:8020/tmp/Clustering/seqOutput"

mahout seq2sparse -i ${INPUT_PATH} -o ${OUTPUT_PATH} -ow -nv -x 100 -n 2 -wt tf
sudo -u hdfs hadoop fs -chmod -R 777 ${OUTPUT_PATH}

But none of these approaches works.
The error for the latter is -

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
sudo: no tty present and no askpass program specified
15/06/05 12:23:59 WARN driver.MahoutDriver: No seq2sparse.props found on classpath, will use command-line arguments only
15/06/05 12:24:01 INFO vectorizer.SparseVectorsFromSequenceFiles: Maximum n-gram size is: 1



For the sudo: no tty present error, I have already dealt with it in /etc/sudoers -
Defaults !requiretty
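
Since the sudo call is only there to chmod HDFS paths, one way to avoid sudo inside the shell action altogether would be to do the chmod with an Oozie <fs> action before the mahout step. This is only a sketch, under the assumption that the workflow user has sufficient rights on /tmp/Clustering and that the workflow schema in use supports the nested <recursive/> element; the action name chmodSeqOutput is a placeholder:

<action name="chmodSeqOutput">
    <fs>
        <!-- open up permissions on the seq2sparse input without calling sudo from a shell -->
        <chmod path="${nameNode}/tmp/Clustering/seqOutput" permissions="777" dir-files="true">
            <recursive/>
        </chmod>
    </fs>
    <ok to="mahoutSeq2Sparse"/>
    <error to="fail"/>
</action>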

Mahout is installed on the node where the oozie server is installed.

Likewise, the following oozie workflow does not work -
<workflow-app xmlns="uri:oozie:workflow:0.4" name="map-reduce-wf">
<action name="mahoutSeq2Sparse">
       <ssh>
        <host>rootUserName@abc05.ad.abc.com<host>
        <command>mahout seq2sparse</command>
        <args>-i</arg>
        <args>${nameNode}/tmp/Clustering/seqOutput</arg>
        <args>-o</arg>
        <args>${nameNode}/tmp/Clustering/seqToSparse</arg>
        <args>-ow</args>
        <args>-nv</args>
        <args>-x</args>
        <args>100</args>
        <args>-n</args>
        <args>2</args>
        <args>-wt</args>
        <args>tf</args>
         <capture-output/>
      </ssh>

    <ok to="brandCanopyInitialCluster" />
    <error to="fail" />
</action>

The error is - Error: E0701 : E0701: XML schema error, cvc-complex-type.2.4.a: Invalid content was found starting with element 'ssh'. One of '{"uri:oozie:workflow:0.4":map-reduce, "uri:oozie:workflow:0.4":pig, "uri:oozie:workflow:0.4":sub-workflow, "uri:oozie:workflow:0.4":fs, "uri:oozie:workflow:0.4":java, WC[##other:"uri:oozie:workflow:0.4"]}' is expected.
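
For what it is worth, the expected-element list in that E0701 message ends with WC[##other:"uri:oozie:workflow:0.4"], which hints that <ssh> is only accepted when it is declared in the ssh action-extension namespace rather than in the default workflow namespace. A sketch of the same action written against the documented ssh extension schema, with the <args> tags closed consistently (assuming the SSH action is enabled on the Oozie server):

<action name="mahoutSeq2Sparse">
    <!-- the ssh element must use the ssh action-extension namespace, not the workflow namespace -->
    <ssh xmlns="uri:oozie:ssh-action:0.1">
        <host>rootUserName@abc05.ad.abc.com</host>
        <command>mahout</command>
        <args>seq2sparse</args>
        <args>-i</args>
        <args>${nameNode}/tmp/Clustering/seqOutput</args>
        <args>-o</args>
        <args>${nameNode}/tmp/Clustering/seqToSparse</args>
        <args>-ow</args>
        <args>-nv</args>
        <args>-x</args>
        <args>100</args>
        <args>-n</args>
        <args>2</args>
        <args>-wt</args>
        <args>tf</args>
        <capture-output/>
    </ssh>
    <ok to="brandCanopyInitialCluster"/>
    <error to="fail"/>
</action>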
Would installing mahout on all the nodes help? (oozie can run the script on any node.)
Is there a way to make mahout available across the hadoop cluster?

Any other solution is also welcome.

Thanks in advance.

EDIT:
I have changed the approach slightly and now call the seq2sparse class directly. The workflow is -
 <action name="mahoutSeq2Sparse">
    <java>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>

       <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>

        </configuration>
            <main-class>org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles</main-class>
            <arg>-i</arg>
         <arg>${nameNode}/tmp/OozieData/Clustering/seqOutput</arg>
         <arg>-o</arg>
         <arg>${nameNode}/tmp/OozieData/Clustering/seqToSparse</arg>
         <arg>-ow</arg>
         <arg>-nv</arg>
         <arg>-x</arg>
         <arg>100</arg>
         <arg>-n</arg>
         <arg>2</arg>
         <arg>-wt</arg>
         <arg>tf</arg>

    </java>
    <ok to="CanopyInitialCluster"/>
    <error to="fail"/>
</action>

The job still does not run, and the error is
    >>> Invoking Main class now >>>

Main class        : org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles
Arguments         :
                    -i
                    hdfs://abc:8020/tmp/OozieData/Clustering/seqOutput
                    -o
                    hdfs://abc:8020/tmp/OozieData/Clustering/seqToSparse
                    -ow
                    -nv
                    -x
                    100
                    -n
                    2
                    -wt
                    tf

Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat
Heart beat

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.JavaMain], main() threw exception, java.lang.IllegalStateException: Job failed!
org.apache.oozie.action.hadoop.JavaMainException: java.lang.IllegalStateException: Job failed!
    at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
    at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:36)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.IllegalStateException: Job failed!
    at org.apache.mahout.vectorizer.DictionaryVectorizer.startWordCounting(DictionaryVectorizer.java:368)
    at org.apache.mahout.vectorizer.DictionaryVectorizer.createTermFrequencyVectors(DictionaryVectorizer.java:179)
    at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:288)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.main(SparseVectorsFromSequenceFiles.java:56)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
    ... 15 more

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file: hdfs://vchniecnveg02:8020/user/root/oozie-oozi/0000054-150604142118313-oozie-oozi-W/mahoutSeq2Sparse--java/action-data.seq

Oozie Launcher ends

Best Answer

Those errors in Oozie are very frustrating. In my experience, most of them are caused by a typo in the XML or in the order of the arguments.

In the last workflow, you did not close the host tag:

<host>rootUserName@abc05.ad.abc.com<host>

It should be:
<host>rootUserName@abc05.ad.abc.com</host>

For the shell error, first of all I suggest using version 0.2 of the shell action (defined here: https://oozie.apache.org/docs/4.0.0/DG_ShellActionExtension.html#AE.A_Appendix_A_Shell_XML-Schema) and removing all the arguments, as well as everything that is not needed just to get the action to launch (without caring about the result for now).

You need to use:
<shell xmlns="uri:oozie:shell-action:0.2">
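
Putting that advice into the action from the question, a stripped-down version to get it launching might look like the sketch below (assuming mahout is on the PATH of whichever node the launcher task lands on; seq2sparse goes in as the first argument rather than inside <exec>):

<action name="mahoutSeq2Sparse">
    <shell xmlns="uri:oozie:shell-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
        </configuration>
        <!-- keep only what is needed to get the action to start -->
        <exec>mahout</exec>
        <argument>seq2sparse</argument>
        <argument>-i</argument>
        <argument>${nameNode}/tmp/Clustering/seqOutput</argument>
        <argument>-o</argument>
        <argument>${nameNode}/tmp/Clustering/seqToSparse</argument>
    </shell>
    <ok to="brandCanopyInitialCluster"/>
    <error to="fail"/>
</action>

Once this bare version starts successfully, the remaining options (-ow, -nv, -x 100, -n 2, -wt tf) and <capture-output/> can be added back one at a time.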

Regarding shell - scheduling/running a mahout command in oozie, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/30671706/
