We are currently running Hadoop 2.8.0 on a 10-node cluster and are planning to upgrade to the latest release, Hadoop 3.0.0.
I want to know whether there will be any compatibility issues if we use Hadoop 3.0.0 with older versions of Spark and of other components such as Hive, Pig, and Sqoop.