ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf

Published 2020-07-23 06:55

Question:

I am trying to import data from MySQL into Hive using Sqoop.

MySQL

use sample;

create table forhive(
    id int auto_increment,
    firstname varchar(36),
    lastname varchar(36),
    primary key(id)
);

insert into  forhive(firstname, lastname) values("sample","singh");

select * from forhive;

1 abhay agrawal

2 vijay sharma

3 sample singh

This is the Sqoop command I'm using (version 1.4.7)

sqoop import --connect jdbc:mysql://********:3306/sample 

--table forhive --split-by id --columns id,firstname,lastname  

--target-dir /home/programmeur_v/forhive 

--hive-import --create-hive-table --hive-table sqp.forhive --username vaibhav -P

This is the error I'm getting:

Error Log

18/08/02 19:19:49 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7

Enter password:

18/08/02 19:19:55 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override

18/08/02 19:19:55 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.

18/08/02 19:19:55 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.

18/08/02 19:19:55 INFO tool.CodeGenTool: Beginning code generation

18/08/02 19:19:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:19:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:19:56 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/programmeur_v/softwares/hadoop-2.9.1

Note: /tmp/sqoop-programmeur_v/compile/e8ffa12496a2e421f80e1fa16e025d28/forhive.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

18/08/02 19:19:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-programmeur_v/compile/e8ffa12496a2e421f80e1fa16e025d28/forhive.jar

18/08/02 19:19:58 WARN manager.MySQLManager: It looks like you are importing from mysql.

18/08/02 19:19:58 WARN manager.MySQLManager: This transfer can be faster! Use the --direct

18/08/02 19:19:58 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.

18/08/02 19:19:58 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)

18/08/02 19:19:58 INFO mapreduce.ImportJobBase: Beginning import of forhive

18/08/02 19:19:58 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar

18/08/02 19:19:59 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps

18/08/02 19:19:59 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032

18/08/02 19:20:02 INFO db.DBInputFormat: Using read commited transaction isolation

18/08/02 19:20:02 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(id), MAX(id) FROM forhive

18/08/02 19:20:02 INFO db.IntegerSplitter: Split size: 0; Num splits: 4 from: 1 to: 3

18/08/02 19:20:02 INFO mapreduce.JobSubmitter: number of splits:3

18/08/02 19:20:02 INFO Configuration.deprecation: yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled

18/08/02 19:20:02 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1533231535061_0006

18/08/02 19:20:03 INFO impl.YarnClientImpl: Submitted application application_1533231535061_0006

18/08/02 19:20:03 INFO mapreduce.Job: The url to track the job: http://instance-1:8088/proxy/application_1533231535061_0006/

18/08/02 19:20:03 INFO mapreduce.Job: Running job: job_1533231535061_0006

18/08/02 19:20:11 INFO mapreduce.Job: Job job_1533231535061_0006 running in uber mode : false

18/08/02 19:20:11 INFO mapreduce.Job: map 0% reduce 0%

18/08/02 19:20:21 INFO mapreduce.Job: map 33% reduce 0%

18/08/02 19:20:24 INFO mapreduce.Job: map 100% reduce 0%

18/08/02 19:20:25 INFO mapreduce.Job: Job job_1533231535061_0006 completed successfully

18/08/02 19:20:25 INFO mapreduce.Job: Counters: 31

        File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=622830
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=295
        HDFS: Number of bytes written=48
        HDFS: Number of read operations=12
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=6
        Job Counters 
        Killed map tasks=1
        Launched map tasks=3
        Other local map tasks=3
        Total time spent by all maps in occupied slots (ms)=27404
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=27404
        Total vcore-milliseconds taken by all map tasks=27404
        Total megabyte-milliseconds taken by all map tasks=28061696
        Map-Reduce Framework
        Map input records=3
        Map output records=3
        Input split bytes=295
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=671
        CPU time spent (ms)=4210
        Physical memory (bytes) snapshot=616452096
        Virtual memory (bytes) snapshot=5963145216
        Total committed heap usage (bytes)=350224384
        File Input Format Counters 
        Bytes Read=0
        File Output Format Counters 
        Bytes Written=48

18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Transferred 48 bytes in 25.828 seconds (1.8584 bytes/sec)

18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Retrieved 3 records.

18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table forhive

18/08/02 19:20:25 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:20:25 INFO hive.HiveImport: Loading uploaded data into Hive

18/08/02 19:20:25 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

18/08/02 19:20:25 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
        at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
        ... 12 more

After googling the same error, I also added HIVE_CONF_DIR to my .bashrc:

export HIVE_HOME=/home/programmeur_v/softwares/apache-hive-1.2.2-bin

export HIVE_CONF_DIR=/home/programmeur_v/softwares/apache-hive-1.2.2-bin/conf

export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin:$HIVE_CONF_DIR
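A quick sanity check that the exported paths actually exist can rule out a typo (the variables are the ones from my .bashrc above; the loop itself is generic):

```shell
# Each exported variable should name a real directory; a typo here
# would silently break the classpath later.
for d in "$HIVE_HOME" "$HIVE_CONF_DIR"; do
  if [ -d "$d" ]; then echo "ok: $d"; else echo "missing: $d"; fi
done
```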

All my Hadoop services are also up and running (jps output):

6976 NameNode

7286 SecondaryNameNode

7559 NodeManager

7448 ResourceManager

8522 DataNode

14587 Jps

I'm just unable to figure out what mistake I'm making here. Please guide!

Answer 1:

Download the file hive-common-0.10.0.jar (a web search will turn it up) and place it in the sqoop/lib folder. This solution worked for me.



Answer 2:

You need to download the file hive-common-0.10.0.jar and copy it to the $SQOOP_HOME/lib folder.
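If Hive is already installed locally, the same jar ships under $HIVE_HOME/lib, so it can be copied from there instead of downloaded, with the version automatically matching your install. A sketch, assuming $HIVE_HOME and $SQOOP_HOME are both set:

```shell
# Copy Hive's hive-common jar (the one that contains HiveConf)
# into Sqoop's lib directory so Sqoop's launcher picks it up.
cp "$HIVE_HOME"/lib/hive-common-*.jar "$SQOOP_HOME"/lib/ \
  && echo "copied" \
  || echo "no hive-common jar found -- check HIVE_HOME"
```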



Answer 3:

Edit your .bash_profile, then add HADOOP_CLASSPATH:

vim ~/.bash_profile

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*

source ~/.bash_profile
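This works because the sqoop wrapper script launches its JVM through hadoop, and the hadoop script appends $HADOOP_CLASSPATH to the classpath, which makes the Hive jars (including hive-common, where HiveConf lives) visible. A minimal sanity check, reusing the question's paths (adjust HIVE_HOME for your machine):

```shell
export HIVE_HOME=/home/programmeur_v/softwares/apache-hive-1.2.2-bin
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*

# The Hive lib entry should now appear among the classpath elements.
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | grep hive
```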



Answer 4:

Go to the $HIVE_HOME/lib directory:

cd $HIVE_HOME/lib

Then copy hive-common-x.x.x.jar into $SQOOP_HOME/lib:

cp hive-common-x.x.x.jar $SQOOP_HOME/lib



Answer 5:

I got the same issue when I tried to import data from MySQL to Hive with the following command:

sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password z*****3 --table users -m 1 --hive-home /opt/hive --hive-import --hive-overwrite

Finally, these environment variables made it work perfectly.

export HIVE_HOME=/opt/hive
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*
export HIVE_CONF_DIR=$HIVE_HOME/conf