NoSuchMethodError while running AWS S3 client on Spark while javap shows otherwise

Published 2019-10-20 11:19

I'm hitting a runtime problem with some code I run on Apache Spark. I depend on the AWS SDK to upload a file to S3, and it is erroring out with a NoSuchMethodError. It's worth noting that the error occurs when I run my code with an uber jar that bundles the Spark dependencies:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.http.impl.conn.DefaultClientConnectionOperator.<init>(Lorg/apache/http/conn/scheme/SchemeRegistry;Lorg/apache/http/conn/DnsResolver;)V
at org.apache.http.impl.conn.PoolingClientConnectionManager.createConnectionOperator(PoolingClientConnectionManager.java:140)
at org.apache.http.impl.conn.PoolingClientConnectionManager.<init>(PoolingClientConnectionManager.java:114)
at org.apache.http.impl.conn.PoolingClientConnectionManager.<init>(PoolingClientConnectionManager.java:99)
at com.amazonaws.http.ConnectionManagerFactory.createPoolingClientConnManager(ConnectionManagerFactory.java:29)
at com.amazonaws.http.HttpClientFactory.createHttpClient(HttpClientFactory.java:97)
at com.amazonaws.http.AmazonHttpClient.<init>(AmazonHttpClient.java:165)
at com.amazonaws.AmazonWebServiceClient.<init>(AmazonWebServiceClient.java:119)
at com.amazonaws.AmazonWebServiceClient.<init>(AmazonWebServiceClient.java:103)
at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:357)
at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:339)

However, when I inspect the jar for the method signature with javap, I can clearly see it listed:

vagrant@mesos:~/installs/spark-1.0.1-bin-hadoop2$ javap -classpath /tmp/rickshaw-spark-0.0.1-SNAPSHOT.jar org.apache.http.impl.conn.DefaultClientConnectionOperator
Compiled from "DefaultClientConnectionOperator.java"
public class org.apache.http.impl.conn.DefaultClientConnectionOperator implements     org.apache.http.conn.ClientConnectionOperator {
protected final org.apache.http.conn.scheme.SchemeRegistry schemeRegistry;
protected final org.apache.http.conn.DnsResolver dnsResolver;
public  org.apache.http.impl.conn.DefaultClientConnectionOperator(org.apache.http.conn.scheme.SchemeRegistry);
public org.apache.http.impl.conn.DefaultClientConnectionOperator(org.apache.http.conn.scheme.SchemeRegistry, org.apache.http.conn.DnsResolver); <-- it exists!
public org.apache.http.conn.OperatedClientConnection createConnection();
public void openConnection(org.apache.http.conn.OperatedClientConnection, org.apache.http.HttpHost, java.net.InetAddress, org.apache.http.protocol.HttpContext, org.apache.http.params.HttpParams) throws java.io.IOException;
public void updateSecureConnection(org.apache.http.conn.OperatedClientConnection, org.apache.http.HttpHost, org.apache.http.protocol.HttpContext, org.apache.http.params.HttpParams) throws java.io.IOException;
protected void prepareSocket(java.net.Socket, org.apache.http.protocol.HttpContext, org.apache.http.params.HttpParams) throws java.io.IOException;
protected java.net.InetAddress[] resolveHostname(java.lang.String) throws java.net.UnknownHostException;

}

I checked some of the other jars in the Spark distribution, and they don't appear to contain this particular method signature. So I'm wondering what is being picked up at Spark runtime that causes this problem. The jar is built from a Maven project, and I lined up the dependencies to make sure the correct AWS Java SDK dependency is being picked up as well.
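One way to confirm what actually gets picked up at runtime is to print the code source of the offending class from inside the job. This is only a small diagnostic sketch (the class name WhichHttpClient is mine, not part of the original post):

import org.apache.http.impl.conn.DefaultClientConnectionOperator;

public class WhichHttpClient {
    public static void main(String[] args) {
        // Prints the jar (or directory) the class was actually loaded from.
        // If this points at a Spark/Hadoop jar instead of the uber jar,
        // an older HttpClient is shadowing the one the AWS SDK expects.
        System.out.println(DefaultClientConnectionOperator.class
                .getProtectionDomain()
                .getCodeSource()
                .getLocation());
    }
}

On the build side, something like mvn dependency:tree -Dincludes=org.apache.httpcomponents shows which httpclient/httpcore versions end up in the uber jar, though that alone does not reveal what Spark itself puts on the runtime classpath.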

Answer 1:

The Spark 1.0.x distribution already contains an incompatible version of DefaultClientConnectionOperator, and there is no easy way to replace it.

The only workaround I found is to include a custom implementation of PoolingClientConnectionManager that avoids using the missing constructor.

Replacing:

return new DefaultClientConnectionOperator(schreg, this.dnsResolver);

with:

return new DefaultClientConnectionOperator(schreg);
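For context, the gist linked below replaces the whole PoolingClientConnectionManager class; the only change that matters is in the connection-operator factory method. As a minimal, self-contained sketch of that one change (the class name CompatiblePoolingClientConnectionManager is mine, assuming HttpClient 4.2-era APIs):

import org.apache.http.conn.ClientConnectionOperator;
import org.apache.http.conn.scheme.SchemeRegistry;
import org.apache.http.impl.conn.DefaultClientConnectionOperator;
import org.apache.http.impl.conn.PoolingClientConnectionManager;

// Sketch: override createConnectionOperator so that only the single-argument
// DefaultClientConnectionOperator constructor is used - that constructor
// exists in both the old (Spark-bundled) and the newer HttpClient versions.
public class CompatiblePoolingClientConnectionManager extends PoolingClientConnectionManager {

    public CompatiblePoolingClientConnectionManager(SchemeRegistry schreg) {
        super(schreg);
    }

    @Override
    protected ClientConnectionOperator createConnectionOperator(SchemeRegistry schreg) {
        return new DefaultClientConnectionOperator(schreg);
    }
}

Note that the AWS SDK instantiates PoolingClientConnectionManager directly (see ConnectionManagerFactory in the stack trace), so a subclass on its own will not be picked up; that is why the actual workaround shadows the class under its original name and relies on the merge strategy below to keep the custom copy in the assembled jar.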

You need to make sure that your class is the one that gets included in the assembled jar, e.g. with this sbt-assembly merge strategy:

case PathList("org", "apache", "http", "impl", xs @ _*) => MergeStrategy.first

Custom PoolingClientConnectionManager: https://gist.github.com/felixgborrego/568f3460d82d9c12e23c



Source: NoSuchMethodError while running AWS S3 client on Spark while javap shows otherwise