java - MapReduce job started with Oozie gets killed. Why?


I have been trying for several days to start a WordCount (MapReduce) job with Oozie. When I start the job normally (cmd: "hadoop jar *.jar mainclass input output"), everything goes fine. My current Oozie configuration:

  • /applicationdir/lib/wordcount.jar
  • /applicationdir/workflow.xml
  • /text-in
  • /text-out

    workflow.xml

    <action name='wordcount'>
        <map-reduce>
            <job-tracker>${jobtracker}</job-tracker>
            <name-node>${namenode}</name-node>
            <prepare>
                <delete path="${outputdir}" />
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queuename}</value>
                </property>
                <property>
                    <name>mapred.mapper.class</name>
                    <value>hadoopjobs.wordcound.wordcountmr.map</value>
                </property>
                <property>
                    <name>mapred.reducer.class</name>
                    <value>hadoopjobs.wordcound.wordcountmr.reduce</value>
                </property>
                <property>
                    <name>mapreduce.input.fileinputformat.inputdir</name>
                    <value>${inputdir}</value>
                </property>
                <property>
                    <name>mapreduce.output.fileoutputformat.outputdir</name>
                    <value>${outputdir}</value>
                </property>
            </configuration>
        </map-reduce>
        <ok to='end'/>
        <error to='kill'/>
    </action>

    <kill name='kill'>
        <message>error: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>

    <end name='end'/>

job.properties

namenode=hdfs://192.168.1.110:8020
jobtracker=192.168.1.110:8050
queuename=default
oozie.wf.application.path=${namenode}/tmp/testdir/wordcount-example/applicationdir
inputdir=hdfs://192.168.1.110:8020/tmp/testdir/wordcount-example/text-in
outputdir=hdfs://192.168.1.110:8020/tmp/testdir/wordcount-example/text-out

command:

oozie job -oozie http://192.168.1.110:11000/oozie/ -config job.properties -run 
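
To see why it gets killed, the job id printed by -run can be fed back to the standard Oozie CLI: oozie job -oozie http://192.168.1.110:11000/oozie/ -info <job-id> shows which action of the workflow failed, and the -log option prints the job's log (both are stock Oozie CLI options; <job-id> is a placeholder).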

result:

The job gets killed.

--update--

Oozie log: https://docs.google.com/document/d/1bknv4dsescrqpzklhojuaryvesp3q0454ul_5_xvpdk/edit?usp=sharing

I solved it by downloading Cloudera CDH. It comes with Hue, which has a nice UI where you can see the errors in detail. In the end, I fixed the error by deleting the following part from the workflow XML:

    <property>
        <name>mapred.mapper.class</name>
        <value>hadoopjobs.wordcound.wordcountmr.map</value>
    </property>
    <property>
        <name>mapred.reducer.class</name>
        <value>hadoopjobs.wordcound.wordcountmr.reduce</value>
    </property>
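
Why deleting those properties helps (my reading; the original post does not spell it out): mapred.mapper.class and mapred.reducer.class belong to the old org.apache.hadoop.mapred API, while the inputdir/outputdir properties above use the new mapreduce.* names. If map and reduce are written against the new org.apache.hadoop.mapreduce API, setting the old-API properties typically makes the job fail; with them removed, Hadoop falls back to its identity mapper and reducer, so the job runs but may no longer do the actual word counting. To keep the classes and switch cleanly to the new API, a sketch like this should work (mapred.*.new-api and mapreduce.job.*.class are standard Hadoop 2 / Oozie property names; the class names are taken from the question):

    <!-- sketch: tell Oozie to use the new MapReduce API ... -->
    <property>
        <name>mapred.mapper.new-api</name>
        <value>true</value>
    </property>
    <property>
        <name>mapred.reducer.new-api</name>
        <value>true</value>
    </property>
    <!-- ... and name the classes with the new-API property keys -->
    <property>
        <name>mapreduce.job.map.class</name>
        <value>hadoopjobs.wordcound.wordcountmr.map</value>
    </property>
    <property>
        <name>mapreduce.job.reduce.class</name>
        <value>hadoopjobs.wordcound.wordcountmr.reduce</value>
    </property>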
