We are currently running Hadoop 2.8.0 on a 10-node cluster and are planning to upgrade to the latest Hadoop 3.0.0.

I want to know whether there will be any issues if we use Hadoop 3.0.0 with older versions of Spark and other components such as Hive, Pig, and Sqoop.
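One concrete check you can run before and after the upgrade is to ask Hadoop's `VersionInfo` class which Hadoop client libraries a given component's classpath actually resolves to; a mismatch between that and the cluster version is a common source of compatibility trouble. A minimal sketch (the class name `HadoopVersionCheck` is mine; run it with each component's classpath, e.g. via `hadoop jar`):

```java
import org.apache.hadoop.util.VersionInfo;

public class HadoopVersionCheck {
    public static void main(String[] args) {
        // Prints the Hadoop version actually on the classpath,
        // e.g. "2.8.0" before the upgrade or "3.0.0" after it.
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        // Build metadata helps confirm which distribution was loaded.
        System.out.println("Built from " + VersionInfo.getBranch()
                + " by " + VersionInfo.getUser());
    }
}
```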
The latest Hive release does not yet support Hadoop 3.0. It seems that, going forward, Hive may be built on Spark or other execution engines.
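For context on that last point: Hive can already use Spark as an execution engine ("Hive on Spark", available since Hive 1.1) via the `hive.execution.engine` property, which accepts `mr`, `tez`, or `spark`. A minimal sketch over the HiveServer2 JDBC interface, assuming the `hive-jdbc` driver is on the classpath; the connection URL and `some_table` are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveEngineExample {
    public static void main(String[] args) throws Exception {
        // Explicitly register the Hive JDBC driver (needed on older Hive versions).
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // HiveServer2 URL; host, port, and database here are placeholders.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default");
             Statement stmt = conn.createStatement()) {
            // Switch this session to the Spark engine; this only works if the
            // Hive installation is paired with a compatible Spark build.
            stmt.execute("SET hive.execution.engine=spark");
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT COUNT(*) FROM some_table")) {
                while (rs.next()) {
                    System.out.println("row count = " + rs.getLong(1));
                }
            }
        }
    }
}
```

Note that Hive on Spark has its own Hive-to-Spark version compatibility matrix, so it does not remove the need to check each component against Hadoop 3.0.0 before upgrading.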