The last few days I've been experimenting with Hadoop. I'm running Hadoop in pseudo-distributed mode on Ubuntu 12.10 and have successfully executed some standard MapReduce jobs.
Next, I wanted to start experimenting with HBase. I've installed HBase and played around in the shell a bit. That all went fine, so I wanted to experiment with HBase through a simple Java program. I wanted to import the output of one of my previous MapReduce jobs and load it into an HBase table. I've written a Mapper that should produce HFileOutputFormat files, which should be easy to read into an HBase table.
Now, whenever I run the program (using: hadoop jar [compiled jar]) I get a ClassNotFoundException. The program seems unable to resolve com.google.common.primitives.Longs. Of course, I assumed it was just a missing dependency, but the JAR (Google's Guava) is there.
I've tried a lot of different things but can't seem to find a solution.
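One direction I still want to explore (a sketch only, not verified to fix this): since the exception occurs in the reduce task rather than on the client, I suspect the Guava JAR is on my local classpath but never shipped to the task nodes. HBase's TableMapReduceUtil can ship dependency JARs with the job through the distributed cache; the helper class name below is just for illustration:

import java.io.IOException;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

// Illustrative sketch, not a confirmed fix: ship dependency JARs with the
// job so the task JVMs can resolve them at runtime.
public class ShipDependencyJars {
    public static void configure(Job job) throws IOException {
        // Adds HBase's own dependency JARs (Guava among them) to the job's
        // distributed cache and task classpath.
        TableMapReduceUtil.addDependencyJars(job);
        // Also explicitly ship the JAR that contains the class the reducer
        // failed to load.
        TableMapReduceUtil.addDependencyJars(job.getConfiguration(),
                com.google.common.primitives.Longs.class);
    }
}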
I've attached the exception that occurs and the most important classes. I would really appreciate it if someone could help me out or give me some advice on where to look.
Kind regards, Pieterjan
ERROR:
12/12/13 09:02:54 WARN snappy.LoadSnappy: Snappy native library not loaded
12/12/13 09:03:00 INFO mapred.JobClient: Running job: job_201212130304_0020
12/12/13 09:03:01 INFO mapred.JobClient: map 0% reduce 0%
12/12/13 09:04:07 INFO mapred.JobClient: map 100% reduce 0%
12/12/13 09:04:51 INFO mapred.JobClient: Task Id : attempt_201212130304_0020_r_000000_0, Status : FAILED
Error: java.lang.ClassNotFoundException: com.google.common.primitives.Longs
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at org.apache.hadoop.hbase.KeyValue$KVComparator.compare(KeyValue.java:1554)
at org.apache.hadoop.hbase.KeyValue$KVComparator.compare(KeyValue.java:1536)
at java.util.TreeMap.compare(TreeMap.java:1188)
at java.util.TreeMap.put(TreeMap.java:531)
at java.util.TreeSet.add(TreeSet.java:255)
at org.apache.hadoop.hbase.mapreduce.PutSortReducer.reduce(PutSortReducer.java:63)
at org.apache.hadoop.hbase.mapreduce.PutSortReducer.reduce(PutSortReducer.java:40)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:650)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:418)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
JAVA
Mapper:
import java.io.IOException;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
public class TestHBaseMapper extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        // Tab delimiter: \t, whitespace delimiter: \\s+
        String[] s = value.toString().split("\t");
        // Row key: the first field of the input line.
        Put put = new Put(Bytes.toBytes(s[0]));
        // Use value.toString() rather than value.getBytes(): Text.getBytes()
        // returns the backing buffer, which can be longer than the content.
        put.add(Bytes.toBytes("amount"), Bytes.toBytes("value"), Bytes.toBytes(value.toString()));
        context.write(new ImmutableBytesWritable(Bytes.toBytes(s[0])), put);
    }
}
Job:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class TestHBaseRun extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        try {
            Configuration configuration = getConf();
            Job hbasejob = new Job(configuration);
            hbasejob.setJobName("TestHBaseJob");
            hbasejob.setJarByClass(TestHBaseRun.class);
            // Specify the InputFormat and the input path.
            hbasejob.setInputFormatClass(TextInputFormat.class);
            TextInputFormat.setInputPaths(hbasejob, new Path("/hadoopdir/user/data/output/test/"));
            // Set the Mapper, MapOutputKey and MapOutputValue classes.
            hbasejob.setMapperClass(TestHBaseMapper.class);
            hbasejob.setMapOutputKeyClass(ImmutableBytesWritable.class);
            hbasejob.setMapOutputValueClass(Put.class);
            // Specify the OutputFormat and the path. If the path exists, it's reinitialized.
            // In this case HFiles, which can be imported into HBase, are produced.
            hbasejob.setOutputFormatClass(HFileOutputFormat.class);
            FileSystem fs = FileSystem.get(configuration);
            Path outputpath = new Path("/hadoopdir/user/data/hbase/table/");
            fs.delete(outputpath, true);
            HFileOutputFormat.setOutputPath(hbasejob, outputpath);
            // Check whether the table exists in HBase; create it if necessary.
            HBaseUtil util = new HBaseUtil(configuration);
            if (!util.exists("test")) {
                util.createTable("test", new String[]{"amount"});
            }
            // Read the existing (or thus newly created) table.
            Configuration hbaseconfiguration = HBaseConfiguration.create(configuration);
            HTable table = new HTable(hbaseconfiguration, "test");
            // Write HFiles to disk. Autoconfigures partitioner and reducer.
            HFileOutputFormat.configureIncrementalLoad(hbasejob, table);
            boolean success = hbasejob.waitForCompletion(true);
            // Load the generated files into the table.
            LoadIncrementalHFiles loader = new LoadIncrementalHFiles(hbaseconfiguration);
            loader.doBulkLoad(outputpath, table);
            return success ? 0 : 1;
        } catch (Exception ex) {
            System.out.println("Error: " + ex.getMessage());
        }
        return 1;
    }

    // Entry point, so the job can be launched with: hadoop jar [compiled jar]
    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new TestHBaseRun(), args));
    }
}
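For reference, HBaseUtil is a small helper class of mine; the exists/createTable methods match what the job above calls. It's roughly equivalent to this sketch (the real class has some extra error handling):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

// Rough sketch of the HBaseUtil helper used by the job above: checks for a
// table and creates it with the given column families if it doesn't exist.
public class HBaseUtil {
    private final HBaseAdmin admin;

    public HBaseUtil(Configuration conf) throws IOException {
        this.admin = new HBaseAdmin(HBaseConfiguration.create(conf));
    }

    public boolean exists(String tableName) throws IOException {
        return admin.tableExists(tableName);
    }

    public void createTable(String tableName, String[] families) throws IOException {
        HTableDescriptor descriptor = new HTableDescriptor(tableName);
        for (String family : families) {
            descriptor.addFamily(new HColumnDescriptor(family));
        }
        admin.createTable(descriptor);
    }
}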