I need to schedule an Oozie Java action that interacts with secured HBase, so I need to provide HBase credentials to the Java action. I am using a secured Hortonworks HDP 2.2 environment; my workflow XML is below:
<workflow-app xmlns="uri:oozie:workflow:0.4" name="solr-wf">
    <credentials>
        <credential name="hbase" type="hbase">
        </credential>
    </credentials>
    <start to="java-node"/>
    <action name="java-node" cred="hbase">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <main-class>com.test.hbase.TestHBaseSecure</main-class>
            <arg>${arg1}</arg>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
I have also modified the Oozie property (oozie.credentials.credentialclasses in oozie-site.xml) to include the HbaseCredentials class:
oozie.credentials.credentialclasses=hcat=org.apache.oozie.action.hadoop.HCatCredentials,hbase=org.apache.oozie.action.hadoop.HbaseCredentials
But I am not able to run the job; it throws an error. Below is the stack trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at org.apache.oozie.action.hadoop.HbaseCredentials.copyHbaseConfToJobConf(HbaseCredentials.java:60)
at org.apache.oozie.action.hadoop.HbaseCredentials.addtoJobConf(HbaseCredentials.java:49)
at org.apache.oozie.action.hadoop.JavaActionExecutor.setCredentialTokens(JavaActionExecutor.java:1054)
at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:913)
at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1135)
at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:228)
at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:63)
at org.apache.oozie.command.XCommand.call(XCommand.java:281)
at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Other jobs run fine; it is only the job with the HBase interaction that fails. I have included all the HBase jars in my workflow's lib directory, but I am not able to figure out the issue.
Updated workflow.xml:
<workflow-app xmlns="uri:oozie:workflow:0.4" name="${appName}">
    <credentials>
        <credential name="hbase-cred" type="hbase">
            <property>
                <name>hbase.master.kerberos.principal</name>
                <value>hbase/_HOST@ABC.COM</value>
            </property>
            <property>
                <name>hbase.master.keytab.file</name>
                <value>/etc/security/keytabs/hbase.service.keytab</value>
            </property>
            <property>
                <name>hbase.regionserver.kerberos.principal</name>
                <value>hbase/_HOST@ABC.COM</value>
            </property>
            <property>
                <name>hbase.regionserver.keytab.file</name>
                <value>/etc/security/keytabs/hbase.service.keytab</value>
            </property>
            <property>
                <name>hbase.security.authentication</name>
                <value>kerberos</value>
            </property>
            <property>
                <name>hbase.zookeeper.quorum</name>
                <value>dev1-dn2,dev1-dn3,dev1-dn1</value>
            </property>
            <property>
                <name>zookeeper.znode.parent</name>
                <value>/hbase-secure</value>
            </property>
        </credential>
    </credentials>
    <start to="java-node" />
    <action name="java-node" cred="hbase-cred">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <main-class>com.test.hbase.TestHBaseSecure</main-class>
        </java>
        <ok to="end" />
        <error to="fail" />
    </action>
    <kill name="fail">
        <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end" />
</workflow-app>
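For reference, when the hbase credential is injected by Oozie this way, the Java main class itself should not need any Kerberos-specific code: the delegation token obtained by the Oozie server travels with the launcher job and is picked up from the container's credentials by the HBase client. Below is a minimal sketch of what such a main class could look like, assuming the HBase 0.98 client API that ships with HDP 2.2; the table name and row key are placeholders, not anything from the original post.

package com.test.hbase;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class TestHBaseSecure {
    public static void main(String[] args) throws Exception {
        // hbase-site.xml must be visible on the action's classpath so the client
        // knows the ZooKeeper quorum and znode parent of the secure cluster.
        Configuration conf = HBaseConfiguration.create();

        // No explicit login here: the HBase delegation token that the Oozie server
        // obtained through the <credential type="hbase"> block is expected to be in
        // the launcher's credentials and used automatically for authentication.
        HTable table = new HTable(conf, "test_table"); // placeholder table name
        try {
            Result result = table.get(new Get(Bytes.toBytes("row1"))); // placeholder row key
            System.out.println("Row found: " + !result.isEmpty());
        } finally {
            table.close();
        }
    }
}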
This solution was tested on HDP 2.2.8:
Copy the HBase jars (at minimum the one providing org.apache.hadoop.hbase.HBaseConfiguration, which the stack trace reports as missing) to
/usr/hdp/current/oozie-server/oozie-server/webapps/oozie/WEB-INF/lib
then restart the Oozie server.
These "credentials" are managed by the Oozie service, not by your job.
So, if HortonWorks had done a decent job of packaging their distro... hbase-common-*-hadoop2.jar would have been deployed in /usr/hdp/current/oozie-client/libserver/ on installation. It is not the case for HDP 2.2.4 as installed on our Prod cluster. Arghh. The damn thing is broken in that damn release.
You've got to manage the Kerberos ticket all by yourself, downloading a keytab <file> from HDFS and creating the TGT before actually connecting to HBase. We've been there. Have a look at that post for some insights about how it can be done.
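A rough sketch of that do-it-yourself fallback follows, assuming the keytab has been shipped to the action's working directory through a <file> element in the workflow; the principal, keytab file name, class name, table name and row key are all placeholders, and the configuration values simply mirror the ones from the credential block above.

package com.test.hbase;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.security.UserGroupInformation;

public class TestHBaseSecureManualLogin { // hypothetical class name
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Same settings the <credential> block carries; normally they would come
        // from an hbase-site.xml on the classpath.
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("hbase.security.authentication", "kerberos");
        conf.set("hbase.master.kerberos.principal", "hbase/_HOST@ABC.COM");
        conf.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@ABC.COM");
        conf.set("hbase.zookeeper.quorum", "dev1-dn2,dev1-dn3,dev1-dn1");
        conf.set("zookeeper.znode.parent", "/hbase-secure");

        // Create the TGT ourselves from a keytab shipped with the action, e.g. declared
        // as <file>${nameNode}/user/myuser/myuser.keytab</file> in the workflow so it is
        // localized into the working directory. Principal and keytab name are placeholders.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("myuser@ABC.COM", "myuser.keytab");

        // Only then talk to HBase.
        HTable table = new HTable(conf, "test_table"); // placeholder table name
        try {
            System.out.println(table.get(new Get(Bytes.toBytes("row1")))); // placeholder row key
        } finally {
            table.close();
        }
    }
}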