I can get Phoenix working on a standalone Apache HBase install
(note: all of this is for HBase 1.0.0 on RHEL 6.5).
With the Cloudera flavour of HBase, however, I never get it working without it throwing exceptions (I even tried a minimal RHEL 7 install as the OS).
The same thing happens with Phoenix 4.4 for HBase 1.0.
hbase(main):001:0> version
1.0.0-cdh5.4.4, rUnknown, Mon Jul 6 16:59:55 PDT 2015
Stack trace:
[ec2-user@ip-172-31-60-109 phoenix-4.5.0-HBase-1.0-bin]$ bin/sqlline.py localhost:2181:/hbase
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:localhost:2181:/hbase none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:localhost:2181:/hbase
15/08/06 03:10:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/06 03:10:26 WARN impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
15/08/06 03:10:27 WARN ipc.CoprocessorRpcChannel: Call failed on IOException
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1269)
at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:11619)
at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1746)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1728)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31447)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:966)
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1042)
at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1216)
... 10 more
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:313)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1609)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.createTable(MetaDataProtos.java:11799)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1273)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1261)
at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1737)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
at ...
New Apache Phoenix 4.5.2 Package from Cloudera Labs
Cloudera doesn't officially support Apache Phoenix; it is still in Cloudera Labs, so you cannot find any Cloudera Phoenix tar.gz files in the Cloudera repository. The only place you can find Phoenix in the Cloudera repository is the parcel repository, but parcels can only be used if you install through Cloudera Manager, and the latest Phoenix version available there is 4.3.0.
If you want to run Phoenix 4.4 or 4.5 on the Cloudera Hadoop distribution, you need to rebuild the Phoenix libraries against the CDH dependency jars; you cannot simply use the Apache Phoenix tar.gz.
Here are the steps.
Recently I found that Andrew Purtell has done tremendous work making Phoenix compatible with CDH. It is available on the GitHub page linked below; download the branch appropriate for your CDH release. This saves a lot of time.
https://github.com/chiastic-security/phoenix-for-cloudera/branches
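A minimal fetch-and-build sketch, assuming Maven and a JDK are already installed; the branch name used here is an assumption, so check the branches page and pick the one that matches your CDH/HBase release:
# Sketch only: clone the CDH-adapted Phoenix fork and build it with Maven.
git clone https://github.com/chiastic-security/phoenix-for-cloudera.git
cd phoenix-for-cloudera
git branch -r                        # list the CDH branches available in the fork
git checkout 4.5-HBase-1.0-cdh5      # hypothetical branch name: pick the one for your CDH release
mvn clean package -DskipTests        # rebuilt Phoenix jars end up under the module target/ directories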
Rebuild the source code using the CDH dependency jars: update pom.xml and two source files as follows (my CDH version is 5.4.2).
Extract phoenix-4.5.0-HBase-1.0-bin.tar.gz and replace the Phoenix jars below with the newly built jars.
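As a rough sketch of that swap; the target/ paths are assumptions and depend on where your Maven build actually put the artifacts:
tar xzf phoenix-4.5.0-HBase-1.0-bin.tar.gz
# Overwrite the stock Apache jars in the extracted directory with the CDH-built ones;
# adjust the source paths to match your build output.
cp phoenix-core/target/phoenix-core-4.5.0-HBase-1.0.jar phoenix-4.5.0-HBase-1.0-bin/
cp phoenix-assembly/target/phoenix-4.5.0-HBase-1.0-server.jar phoenix-4.5.0-HBase-1.0-bin/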
Replace
phoenix-4.5.0-HBase-1.0-server.jar
and phoenix-core-4.5.0-HBase-1.0.jar
in the HBase lib location and restart HBase. (In 4.7, only phoenix-4.7.0-cdh5.X.1-server.jar needs to be copied to the HBase lib.) Due to some dependency issues, phoenix-pig is not handled; this is just a workaround.
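For a parcel-based CDH install, the copy-and-restart step might look roughly like this; the parcel path below is an assumption, and restarting HBase through Cloudera Manager works just as well:
# Copy the rebuilt jars into HBase's lib directory on every region server and master node.
sudo cp phoenix-4.5.0-HBase-1.0-server.jar /opt/cloudera/parcels/CDH/lib/hbase/lib/
sudo cp phoenix-core-4.5.0-HBase-1.0.jar /opt/cloudera/parcels/CDH/lib/hbase/lib/
# Restart the HBase master and region servers (via Cloudera Manager, or e.g.):
sudo service hbase-master restart
sudo service hbase-regionserver restart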