Taiwan Hadoop Forum http://forum.hadoop.tw/
Problem integrating HBase with Hive http://forum.hadoop.tw/viewtopic.php?f=4&t=38303
Posted by: GGG [ 2016-01-31, 16:59 ]
Subject: Problem integrating HBase with Hive
Environment: VirtualBox 5.0.4, Ubuntu 15.04, Hadoop 2.5.0, HBase 0.98.5, Hive 0.13.1, java-7-oracle.
Setup: one Master and one Slave, each running in VirtualBox on a separate physical host; the two machines can ping each other.

The problem: we followed these two pages to integrate HBase with Hive:
http://blog.csdn.net/u011523533/article ... s/50480608
http://blog.csdn.net/aaronhadoop/articl ... s/28398157

Our /usr/local/hadoop/hbase/lib does not contain hbase-0.98.5-security.jar, so we substituted the following jars instead:
hbase-common-0.98.5-hadoop2.jar
hbase-server-0.98.5-hadoop2.jar
hbase-client-0.98.5-hadoop2.jar
hbase-protocol-0.98.5-hadoop2.jar
htrace-core-2.04.jar
netty-3.6.6.Final.jar
hbase-hadoop2-compat-0.98.5-hadoop2.jar
hbase-hadoop-compat-0.98.5-hadoop2.jar
metrics-core-2.2.0.jar

We then started Hive with:
Code:
bin/hive -hiveconf hbase.zookeeper.quorum=Slave11,Slave12,Slave13

which printed the following warning:
Code:
[main] conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead

We do not understand why this warning appears; any pointers would be appreciated.

Ignoring the warning for now, we continued with the Hive/HBase integration. We created a Hive table:
Code:
create table pokes(foo int, bar string) row format delimited fields terminated by ',';

then bulk-loaded data into it:
Code:
load data local inpath '/home/yujianxin/temp/data1.txt' overwrite into table pokes;

Following the tutorials above, we set:
Code:
SET hive.hbase.bulk=true;

and then ran the insert into the partitioned table:
Code:
insert overwrite table hbase_hive_2 partition (day='2012-01-01') select * from pokes;

At this point the following error appeared:
Code:
Number of reduce tasks is set to 0 since there's no reduce operator
java.io.IOException: java.lang.ClassNotFoundException:
    at org.apache.hadoop.hive.ql.io.HivePassThroughOutputFormat.createActualOF(HivePassThroughOutputFormat.java:73)
    at org.apache.hadoop.hive.ql.io.HivePassThroughOutputFormat.checkOutputSpecs(HivePassThroughOutputFormat.java:83)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.checkOutputSpecs(FileSinkOperator.java:951)
    at org.apache.hadoop.hive.ql.io.HiveOutputFormatImpl.checkOutputSpecs(HiveOutputFormatImpl.java:67)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:460)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:420)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.ClassNotFoundException:
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:278)
    at org.apache.hadoop.hive.ql.io.HivePassThroughOutputFormat.createActualOF(HivePassThroughOutputFormat.java:66)
    ... 38 more
Job Submission failed with exception 'java.io.IOException(java.lang.ClassNotFoundException: )'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

We cannot figure out the cause. Any advice would be greatly appreciated.
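For reference, the DDL used to create hbase_hive_2 is not shown in the post. Based on the tutorials linked above, the table was presumably declared with the HBase storage handler along the following lines; the column names, column mapping, and HBase table name below are assumptions, not taken from the thread:
Code:
-- Hypothetical DDL reconstructed from the linked tutorials, not from the thread.
CREATE TABLE hbase_hive_2 (key INT, value STRING)
PARTITIONED BY (day STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
TBLPROPERTIES ("hbase.table.name" = "hbase_hive_2");

The stack trace shows the failure happening while HivePassThroughOutputFormat tries to load the storage handler's output format class, which suggests a class needed for this HBase-backed table could not be found at job submission time.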
Posted by: jazz [ 2016-02-05, 21:03 ]
Subject: Re: Problem integrating HBase with Hive
DEPRECATED only means the parameter has a new name; leaving it as-is generally has little effect. You can also follow the hint and switch to hive.hmshandler.retry.* instead. As for the java.lang.ClassNotFoundException, it looks like MapReduce cannot find some Java class, but which class is not shown. My first guess is that the HBase jar files need to be placed where MapReduce can find them, e.g. under the MapReduce lib directory. - Jazz
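One common way to try this suggestion with Hive 0.13 (a sketch, not something from the thread) is to pass the HBase jars via the CLI's --auxpath option (or the HIVE_AUX_JARS_PATH environment variable), so they are put on Hive's classpath and shipped with the MapReduce job. The path and jar versions below are the ones mentioned in the original post; adjust them to the actual installation:
Code:
# Sketch: start the Hive CLI with the HBase jars listed in --auxpath so that
# both the CLI and the submitted MapReduce job can load the HBase classes.
HBASE_LIB=/usr/local/hadoop/hbase/lib
bin/hive --auxpath \
$HBASE_LIB/hbase-common-0.98.5-hadoop2.jar,\
$HBASE_LIB/hbase-server-0.98.5-hadoop2.jar,\
$HBASE_LIB/hbase-client-0.98.5-hadoop2.jar,\
$HBASE_LIB/hbase-protocol-0.98.5-hadoop2.jar,\
$HBASE_LIB/htrace-core-2.04.jar,\
$HBASE_LIB/netty-3.6.6.Final.jar,\
$HBASE_LIB/hbase-hadoop2-compat-0.98.5-hadoop2.jar,\
$HBASE_LIB/hbase-hadoop-compat-0.98.5-hadoop2.jar,\
$HBASE_LIB/metrics-core-2.2.0.jar \
-hiveconf hbase.zookeeper.quorum=Slave11,Slave12,Slave13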
Posted by: GGG [ 2016-03-17, 16:50 ]
Subject: Re: Problem integrating HBase with Hive
Thank you for the reply, jazz. We will keep working on it.