Where does Hive store files in HDFS?

Published 2019-01-16 03:26

I'd like to know how to find the mapping between Hive tables and the actual HDFS files (or rather, directories) that they represent. I need to access the table files directly.

Where does Hive store its files in HDFS?

Tags: hadoop hive hdfs
10 Answers
混吃等死
#2 · 2019-01-16 04:04

Run describe formatted <table_name>; inside the Hive shell.

Notice the "Location" value, which shows where the table's data is stored.
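
For example (the table name and path below are purely illustrative, and the exact output layout varies by Hive version):

hive> describe formatted my_table;
...
# Detailed Table Information
Database:            default
Owner:               hive
Location:            hdfs://namenode:8020/user/hive/warehouse/my_table
Table Type:          MANAGED_TABLE
...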

走好不送
#3 · 2019-01-16 04:04

Another way to check where a specific table is stored is to execute this query in the Hive interactive shell:

show create table table_name;

where table_name is the name of the subject table.

An example of the above query for a 'customers' table would look something like this:

CREATE TABLE `customers`(
  `id` string, 
  `name` string)
COMMENT 'Imported by sqoop on 2016/03/01 13:01:49'
ROW FORMAT DELIMITED 
  FIELDS TERMINATED BY ',' 
  LINES TERMINATED BY '\n' 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://quickstart.cloudera:8020/user/hive/warehouse/sqoop_workspace.db/customers'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='true', 
  'numFiles'='4', 
  'totalSize'='77', 
  'transient_lastDdlTime'='1456866115')

The LOCATION value in the example above is what to focus on: it is the HDFS path where this table's data is stored (under the Hive warehouse directory in this case).
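
Once you have the LOCATION, you can inspect the underlying files with the HDFS CLI. A quick sketch, reusing the path from the example output above:

hdfs dfs -ls hdfs://quickstart.cloudera:8020/user/hive/warehouse/sqoop_workspace.db/customers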

Don't forget to like if you like this solution. Cheers!

beautiful°
#4 · 2019-01-16 04:07

If you look at the hive-site.xml file, you will see something like this:

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/usr/hive/warehouse</value>
  <description>location of the warehouse directory</description>
</property>

/usr/hive/warehouse is the default location for all managed tables. External tables may be stored at a different location.

describe formatted <table_name> is the Hive shell command that can be used more generally to find the location of the data belonging to a Hive table.
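
As a rough sketch of how that warehouse directory is laid out (assuming the /usr/hive/warehouse value from the snippet above; the database and table names are placeholders), managed table data ends up under <warehouse_dir>/<db_name>.db/<table_name>, and tables in the default database sit directly under the warehouse directory:

hdfs dfs -ls /usr/hive/warehouse                  # databases (*.db) and default-database tables
hdfs dfs -ls /usr/hive/warehouse/mydb.db/mytable  # data files of one managed table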

乱世女痞
#5 · 2019-01-16 04:10

In the Hive terminal, type:

hive> set hive.metastore.warehouse.dir;

(it will print the path)
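
It prints the property as name=value; on a typical installation the output looks roughly like this (the actual path is whatever your site is configured with):

hive.metastore.warehouse.dir=/user/hive/warehouse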

做个烂人
#6 · 2019-01-16 04:11

Where the files are stored on HDFS is fairly easy to figure out once you know where to look. :)

If you go to http://NAMENODE_MACHINE_NAME:50070/ in your browser it should take you to a page with a Browse the filesystem link.

In the $HIVE_HOME/conf directory there is the hive-default.xml and/or hive-site.xml which has the hive.metastore.warehouse.dir property. That value is where you will want to navigate to after clicking the Browse the filesystem link.

In mine, it's /usr/hive/warehouse. Once I navigate to that location, I see the names of my tables. Clicking on a table name (which is just a folder) will then expose the partitions of the table. In my case, I currently only have it partitioned on date. When I click on the folder at this level, I will then see files (more partitioning will have more levels). These files are where the data is actually stored on the HDFS.

I have not attempted to access these files directly, but I assume it can be done. I would take GREAT care if you are thinking about editing them. :) For me, I'd figure out a way to do what I need without direct access to the Hive data on disk. If you need access to the raw data, you can run a Hive query and output the result to a file; the output will have the exact same structure (delimiter between columns, etc.) as the files on HDFS. I do queries like this all the time and convert them to CSVs.

The documentation section on how to write data from queries to the filesystem is https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML#LanguageManualDML-Writingdataintothefilesystemfromqueries
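
A minimal sketch of that export approach (the directory, delimiter, and table name are placeholders; the ROW FORMAT clause on this statement needs a reasonably recent Hive):

INSERT OVERWRITE LOCAL DIRECTORY '/tmp/my_table_export'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM my_table;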

UPDATE

Since Hadoop 3.0.0-alpha1 there has been a change in the default port numbers: NAMENODE_MACHINE_NAME:50070 becomes NAMENODE_MACHINE_NAME:9870. Use the latter if you are running on Hadoop 3.x. The full list of port changes is described in HDFS-9427.

ら.Afraid
#7 · 2019-01-16 04:12

Hive tables are not necessarily stored in the warehouse (since you can create tables located anywhere on HDFS).

You should use the DESCRIBE FORMATTED <table_name> command. To extract just the path from a script:

hive -S -e "describe formatted <table_name> ;" | grep 'Location' | awk '{ print $NF }'

Please note that partitions may be stored in different places; to get the location of the alpha=foo/beta=bar partition, you'd have to add partition(alpha='foo',beta='bar') after <table_name>, as in the sketch below.
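
For example (the table name and partition values are placeholders):

hive -S -e "describe formatted <table_name> partition(alpha='foo',beta='bar');" | grep 'Location' | awk '{ print $NF }'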
