Spark - Saving JavaRDD to Cassandra

Posted 2019-06-21 04:00

http://www.datastax.com/dev/blog/accessing-cassandra-from-spark-in-java

The link above shows a way to save a JavaRDD to Cassandra like this:

import static com.datastax.spark.connector.CassandraJavaUtil.*;

JavaRDD<Product> productsRDD = sc.parallelize(products);
javaFunctions(productsRDD, Product.class).saveToCassandra("java_api", "products");

But the com.datastax.spark.connector.CassandraJavaUtil.* import seems to be deprecated. The updated API should be:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.*;

Can someone show me some code to store a JavaRDD in Cassandra using the updated API above?

2 Answers
欢心
#2 · 2019-06-21 04:32

Following the documentation, it should be like this:

javaFunctions(rdd).writerBuilder("ks", "people", mapToRow(Person.class)).saveToCassandra();
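A fuller sketch of the updated japi style, using the keyspace and table from the question. The `Product` bean fields, the connection host, and the sample data are assumptions for illustration; the bean needs a no-arg constructor and getters/setters so `mapToRow` can derive a row writer. This requires a running Spark context and a reachable Cassandra cluster, so it is a sketch rather than a drop-in program:

```java
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

import java.io.Serializable;
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SaveToCassandraExample {

    // Hypothetical bean mapped to the java_api.products table;
    // mapToRow(Product.class) matches its properties to column names.
    public static class Product implements Serializable {
        private Integer id;
        private String name;

        public Product() {}
        public Product(Integer id, String name) { this.id = id; this.name = name; }

        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("save-to-cassandra")
                // Assumed host; point this at your Cassandra cluster.
                .set("spark.cassandra.connection.host", "127.0.0.1");
        JavaSparkContext sc = new JavaSparkContext(conf);

        List<Product> products = Arrays.asList(
                new Product(1, "widget"),
                new Product(2, "gadget"));
        JavaRDD<Product> productsRDD = sc.parallelize(products);

        // Updated japi style: build a writer from the bean mapping,
        // then save to keyspace "java_api", table "products".
        javaFunctions(productsRDD)
                .writerBuilder("java_api", "products", mapToRow(Product.class))
                .saveToCassandra();

        sc.stop();
    }
}
```

Note the shape change from the deprecated API: the target class moves out of `javaFunctions(...)` and into `mapToRow(...)` passed to `writerBuilder`, and `saveToCassandra()` takes no arguments.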
别忘想泡老子
#3 · 2019-06-21 04:42

Replace

JavaRDD<Product> productsRDD = sc.parallelize(products);
javaFunctions(productsRDD, Product.class).saveToCassandra("java_api", "products");

with

JavaRDD<Product> productsRDD = sc.parallelize(products);
javaFunctions(productsRDD).writerBuilder("java_api", "products", mapToRow(Product.class)).saveToCassandra();
