I am new to NoSQL solutions and want to play with Hive. But installing HDFS/Hadoop takes a lot of resources and time (maybe due to my inexperience, but I don't have time for it).
Are there ways to install and use Hive on a local machine without HDFS/Hadoop?
I would recommend you use something like this:
http://hortonworks.com/products/hortonworks-sandbox/
It's a fully functional VM with everything you need to start right away.
Yes, you can run Hive without Hadoop:

1. Create your warehouse directory on your local file system.
2. Set the default filesystem to file:/// in hive-site.xml.

Then you can run Hive in local mode without a Hadoop installation.
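A minimal hive-site.xml for the local-mode setup above might look like this (fs.defaultFS and hive.metastore.warehouse.dir are the standard Hadoop/Hive property names; the warehouse path is just an example, point it at any local directory you own):

```xml
<configuration>
  <!-- Use the local file system instead of HDFS -->
  <property>
    <name>fs.defaultFS</name>
    <value>file:///</value>
  </property>
  <!-- Local directory that acts as the Hive warehouse (example path) -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/youruser/hive/warehouse</value>
  </property>
</configuration>
```

With this in place, Hive stores table data under the local warehouse directory and runs queries in local mode.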
If you just want to experience Hive before making a decision, you can use a preconfigured VM as @Maltram suggested (Hortonworks, Cloudera, IBM and others all offer such VMs).
What you should keep in mind is that you will not be able to use Hive in production without Hadoop and HDFS, so if that is a problem for you, you should consider alternatives to Hive.
Update: this answer is out of date. With Hive on Spark it is no longer necessary to have HDFS support.

Hive requires HDFS and MapReduce, so you will need them. The other answer has some merit in recommending a simple, pre-configured means of getting all of the components there for you.
But the gist of it is: Hive needs Hadoop and MapReduce, so to some degree you will have to deal with them.
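If you go the Hive-on-Spark route mentioned in the update, the execution engine is chosen through a standard Hive property; a sketch of the relevant hive-site.xml entries follows (the local Spark master value is an assumption for a single-machine, non-cluster setup):

```xml
<!-- Switch Hive's execution engine from MapReduce to Spark -->
<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>
<!-- Run Spark locally rather than against a cluster (assumed single-machine setup) -->
<property>
  <name>spark.master</name>
  <value>local</value>
</property>
```

The same engine switch can also be made per session from the Hive CLI with: set hive.execution.engine=spark;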
You can't just download Hive and run it:
Hadoop is like a core, and Hive needs some libraries from it.