So here is the setup.
Currently I have two Spark applications initialized, and I need to pass data between them, preferably through a shared SparkContext/SQLContext so I can just query a temp table. Right now I use Parquet files to transfer DataFrames between them (see the sketch after the Java setup below), but is there any other way to do it?
The MasterURL points to the same Spark master in both applications.
Start Spark via Terminal:
/opt/spark/sbin/start-master.sh;
/opt/spark/sbin/start-slave.sh spark://`hostname`:7077
Java App Setup:
// conf = setMaster(MasterURL), 6G memory, and 4 cores
JavaSparkContext context = new JavaSparkContext(conf);
SQLContext sqlContext = new SQLContext(context.sc());
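For completeness, a rough sketch of how that conf is built (assuming the 6G is spark.executor.memory and the 4 cores are spark.cores.max, mirroring the SparkR settings below):

import org.apache.spark.SparkConf;

// MasterURL is the spark://<hostname>:7077 address of the master started above
SparkConf conf = new SparkConf()
        .setMaster(MasterURL)
        .set("spark.executor.memory", "6g")   // assumption: executor memory
        .set("spark.cores.max", "4");         // assumption: max cores per app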
Then I register an existing DataFrame as a temp table later on:
//existing dataframe to temptable
df.registerTempTable("table");
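For context, the Parquet-based transfer mentioned at the top currently works roughly like this (a sketch assuming Spark 1.4+; the path is made up):

// the Java app writes the DataFrame somewhere both applications can reach
df.write().parquet("/tmp/shared_table.parquet");   // path is just an example

// the consuming app then reads it back with its own SQLContext, e.g.
// DataFrame shared = sqlContext.read().parquet("/tmp/shared_table.parquet");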
SparkR App Setup:
sc <- sparkR.init(master='MasterURL', sparkEnvir=list(spark.executor.memory='6G', spark.cores.max='4'))
sqlContext <- sparkRSQL.init(sc)
# attempt to get temptable
df <- sql(sqlContext, "SELECT * FROM table")  # this is where it throws an error