Using Phoenix with Cloudera HBase

Posted 2020-05-20 08:45

I can get Phoenix working on a standalone Apache HBase.

(Note: all of this is for HBase 1.0.0 on RHEL 6.5.)

With the Cloudera flavour of HBase, however, I never get it working without it throwing exceptions (I even tried a minimal RHEL 7 install as the OS).

The same thing happens with Phoenix 4.4 for HBase 1.0.

hbase(main):001:0> version
1.0.0-cdh5.4.4, rUnknown, Mon Jul  6 16:59:55 PDT 2015

stack trace:

[ec2-user@ip-172-31-60-109 phoenix-4.5.0-HBase-1.0-bin]$ bin/sqlline.py localhost:2181:/hbase
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect jdbc:phoenix:localhost:2181:/hbase none none org.apache.phoenix.jdbc.PhoenixDriver
Connecting to jdbc:phoenix:localhost:2181:/hbase
15/08/06 03:10:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/06 03:10:26 WARN impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
15/08/06 03:10:27 WARN ipc.CoprocessorRpcChannel: Call failed on IOException
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1269)
    at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:11619)
    at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7054)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1746)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1728)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31447)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2035)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
    at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildDeletedTable(MetaDataEndpointImpl.java:966)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1042)
    at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1216)
    ... 10 more

    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:313)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1609)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
    at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
    at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.createTable(MetaDataProtos.java:11799)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1273)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1261)
    at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1737)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM.CATALOG: org.apache.hadoop.hbase.client.Scan.setRaw(Z)Lorg/apache/hadoop/hbase/client/Scan;
    at ... 

Answer by 霸刀☆藐视天下 · 2020-05-20 09:23

Cloudera doesn't officially support Apache Phoenix; it is still in Cloudera Labs, so you won't find any Cloudera Phoenix tar.gz files in the Cloudera repository. The only place Phoenix appears in the Cloudera repository is the parcel repository, but parcels can only be used if you install through Cloudera Manager, and the latest available Cloudera Phoenix version is 4.3.0.

If you want to run Phoenix 4.4 or 4.5 on the Cloudera Hadoop distribution, you need to rebuild the Phoenix libraries against the CDH dependency jars; you cannot simply use the Apache Phoenix tar.gz, because CDH's HBase differs in binary signatures from the Apache release (hence the NoSuchMethodError above).

Here are the steps.

Recently I found that Andrew Purtell has done tremendous work to make Phoenix compatible with CDH. His work is available on the GitHub page below; download the branch appropriate for your CDH version. This will save you time.

https://github.com/chiastic-security/phoenix-for-cloudera/branches

  • Download the Apache Phoenix 4.5 source from the Apache repository. (Skip this step if you downloaded from the GitHub page above.)

Rebuild the source code using the CDH dependency jars: update pom.xml and two source files as follows (my CDH version is 5.4.2).

[h4ck3r@host1 phoenix]$ diff phoenix-4.5_Updated/phoenix-4.5.0-HBase-1.0-src/pom.xml  phoenix-4.5_Orig/phoenix-4.5.0-HBase-1.0-src/pom.xml
28c28
< <!--    <module>phoenix-pig</module> -->
---
>     <module>phoenix-pig</module>
37a38,41
>       <id>apache release</id>
>       <url>https://repository.apache.org/content/repositories/releases/</url>
>     </repository>
>     <repository>
42,43c46,50
<       <id>cloudera</id>
<       <url>https://repository.cloudera.com/artifactory/cloudera-repos</url>
---
>       <id>apache snapshot</id>
>       <url>https://repository.apache.org/content/repositories/snapshots/</url>
>       <snapshots>
>         <enabled>true</enabled>
>       </snapshots>
45d51
<
54d59
<
77,81c82,83
<     <hbase.version>1.0.0-cdh5.4.2</hbase.version>
<     <hadoop-two.version>2.6.0-cdh5.4.2</hadoop-two.version>
<       <hadoop.version>2.6.0-cdh5.4.2</hadoop.version>
<     <pig.version>0.12.0</pig.version>
<     <flume.version>1.5.0-cdh5.4.2</flume.version>
---
>     <hbase.version>1.0.1</hbase.version>
>     <hadoop-two.version>2.5.1</hadoop-two.version>
84a87,88
>     <hadoop.version>2.5.1</hadoop.version>
>     <pig.version>0.13.0</pig.version>
97a102
>     <flume.version>1.4.0</flume.version>
449,450c454
<
<   <dependency>
---
>       <dependency>
454c458
<       </dependency>
---
>       </dependency>

[h4ck3r@host1 phoenix]$ diff phoenix-4.5_Updated/phoenix-4.5.0-HBase-1.0-src/phoenix-core/src/main/java/org/apache/hadoop/hbase/regionserver/LocalIndexMerger.java  phoenix-4.5_Orig/phoenix-4.5.0-HBase-1.0-src/phoenix-core/src/main/java/org/apache/hadoop/hbase/regionserver/LocalIndexMerger.java
84c84
<                     rss.getServerName(), metaEntries,1);
---
>                     rss.getServerName(), metaEntries);

[h4ck3r@host1 phoenix]$ diff phoenix-4.5_Updated/phoenix-4.5.0-HBase-1.0-src/phoenix-core/src/main/java/org/apache/hadoop/hbase/regionserver/IndexSplitTransaction.java phoenix-4.5_Orig/phoenix-4.5.0-HBase-1.0-src/phoenix-core/src/main/java/org/apache/hadoop/hbase/regionserver/IndexSplitTransaction.java
291c291
<                 daughterRegions.getSecond().getRegionInfo(), server.getServerName(),1);
---
>                 daughterRegions.getSecond().getRegionInfo(), server.getServerName());
978c978
< }
---
> }
\ No newline at end of file
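The version-property swaps from the pom.xml diff above can be scripted. This sketch applies them with `sed` to a stand-in file; on a real checkout, point `POM` at phoenix-4.5.0-HBase-1.0-src/pom.xml, then run `mvn clean package -DskipTests` from the source root:

```shell
#!/bin/sh
# Stand-in pom fragment with the stock Apache versions (the real file is the
# full pom.xml from the Phoenix 4.5 source tree).
POM=pom-demo.xml
cat > "$POM" <<'EOF'
<hbase.version>1.0.1</hbase.version>
<hadoop-two.version>2.5.1</hadoop-two.version>
EOF

# Swap in the CDH 5.4.2 artifact versions, as in the diff above.
sed -e 's|<hbase.version>.*</hbase.version>|<hbase.version>1.0.0-cdh5.4.2</hbase.version>|' \
    -e 's|<hadoop-two.version>.*</hadoop-two.version>|<hadoop-two.version>2.6.0-cdh5.4.2</hadoop-two.version>|' \
    "$POM" > "$POM.tmp" && mv "$POM.tmp" "$POM"

cat "$POM"
```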
  • The build above will create new jars under the target directory of each sub-component.
  • Download the Apache Phoenix 4.5 binary from the Apache repository.
  • Extract phoenix-4.5.0-HBase-1.0-bin.tar.gz and replace the Phoenix jars below with the newly built jars:

    • phoenix-4.5.0-HBase-1.0-client.jar
    • phoenix-4.5.0-HBase-1.0-server-without-antlr.jar
    • phoenix-4.5.0-HBase-1.0-client-minimal.jar
    • phoenix-assembly-4.5.0-HBase-1.0-tests.jar
    • phoenix-4.5.0-HBase-1.0-client-without-hbase.jar
    • phoenix-core-4.5.0-HBase-1.0.jar
    • phoenix-4.5.0-HBase-1.0-server.jar
  • Replace phoenix-4.5.0-HBase-1.0-server.jar and phoenix-core-4.5.0-HBase-1.0.jar in the HBase lib location and restart HBase. (In 4.7, only phoenix-4.7.0-cdh5.X.1-server.jar needs to be copied to the HBase lib.)

  • Execute the Phoenix commands from the newly updated directory.
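The jar-replacement step above can be scripted. This sketch uses placeholder files and a local demo directory; on a real cluster, `HBASE_LIB` would be something like /usr/lib/hbase/lib (or the parcel's lib/hbase/lib) on every region server, and you would restart HBase afterwards:

```shell
#!/bin/sh
# Demo directory standing in for the HBase lib directory on a region server.
HBASE_LIB=./hbase-lib-demo
mkdir -p "$HBASE_LIB"

# Placeholder files standing in for the rebuilt jars from the Maven run.
touch phoenix-4.5.0-HBase-1.0-server.jar phoenix-core-4.5.0-HBase-1.0.jar

# Copy the rebuilt jars into place; on a real cluster, restart HBase afterwards.
cp phoenix-4.5.0-HBase-1.0-server.jar phoenix-core-4.5.0-HBase-1.0.jar "$HBASE_LIB"/
ls "$HBASE_LIB"
```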

Because of some dependency issues, phoenix-pig is not handled; this is just a workaround.
