If you are using the new MapReduce API and want to implement a DAG with an Oozie workflow, you may face the exception below:
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.jav
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 9 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: class TestMapper not org.apache.hadoop.mapred.Mapper
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:899)
Caused by: java.lang.RuntimeException: class TestMapper not org.apache.hadoop.mapred.Mapper
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:893)
... 16 more
To solve the above issue, we need to add the following two properties to workflow.xml:
<property>
    <name>mapred.reducer.new-api</name>
    <value>true</value>
</property>
<property>
    <name>mapred.mapper.new-api</name>
    <value>true</value>
</property>
Now replace the workflow.xml in HDFS with the updated one, and the issue will be resolved.
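For context, here is a minimal sketch of where these properties would sit inside a map-reduce action in workflow.xml. The action name, the ${jobTracker} and ${nameNode} parameters, and the TestMapper/TestReducer class names are placeholders for this example, and the exact property names for the mapper/reducer classes depend on your Hadoop version (mapreduce.map.class in older releases, mapreduce.job.map.class in Hadoop 2), so adjust them to your own setup:

<action name="mr-node">
    <map-reduce>
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <configuration>
            <!-- Tell Hadoop that the mapper and reducer use the new API -->
            <property>
                <name>mapred.mapper.new-api</name>
                <value>true</value>
            </property>
            <property>
                <name>mapred.reducer.new-api</name>
                <value>true</value>
            </property>
            <!-- With the new API, the classes are set via the mapreduce.*
                 properties instead of mapred.mapper.class/mapred.reducer.class.
                 TestMapper/TestReducer are placeholder class names. -->
            <property>
                <name>mapreduce.map.class</name>
                <value>TestMapper</value>
            </property>
            <property>
                <name>mapreduce.reduce.class</name>
                <value>TestReducer</value>
            </property>
        </configuration>
    </map-reduce>
    <ok to="end"/>
    <error to="fail"/>
</action>

Once edited, the file can be copied back over the old copy in the workflow application directory on HDFS (for example with hdfs dfs -put -f on newer Hadoop versions, or by deleting and re-uploading it on older ones) and the workflow re-submitted.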