Taiwan Hadoop Forum

台灣 Hadoop 技術討論區

All times are UTC + 8 hours




 Post subject: hadoop format problem
Posted: 2015-07-21, 11:55
Offline

Joined: 2015-07-21, 11:50
Posts: 2
While installing Hadoop, the format step produced the screen shown in the attached image.
How can I fix this?


user@T15-4-PC01 /cygdrive/c/hadoop/deploy/hadoop-2.4.1
$ bin/hdfs namenode -format
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/NameNode
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.namenode.NameNode
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.hdfs.server.namenode.NameNode. Program will exit.
Exception in thread "main"


Attachment: had.jpg
 Post subject: Re: hadoop format problem
Posted: 2015-07-21, 23:25
Offline

Joined: 2009-11-09, 19:52
Posts: 2891
arhikydark99 wrote:
While installing Hadoop, the format step produced the screen shown in the attached image.
How can I fix this?
user@T15-4-PC01 /cygdrive/c/hadoop/deploy/hadoop-2.4.1
$ bin/hdfs namenode -format
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/NameNode


This looks related to the Java classpath.
Have you set (1) the HADOOP_HOME environment variable and (2) the JAVA_HOME environment variable?

If more detailed information is needed, please run the following command and report the output.

Code:
user@T15-4-PC01 /cygdrive/c/hadoop/deploy/hadoop-2.4.1
$  bash -x bin/hdfs namenode -format
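
If they are not set yet, a minimal setup in the Cygwin shell might look something like the sketch below. Every path here is only an assumption based on your prompt; adjust them to where Hadoop and the JDK are actually installed.

Code:
# Illustrative sketch only -- the paths below are assumptions, not verified settings.
export HADOOP_HOME=/cygdrive/c/hadoop/deploy/hadoop-2.4.1   # where the tarball was unpacked
export JAVA_HOME=/cygdrive/c/Java/jdk1.6.0_45               # point this at your actual JDK
export PATH="$PATH:$HADOOP_HOME/bin"

# Quick sanity checks before retrying the format:
echo "$JAVA_HOME"
"$JAVA_HOME/bin/java" -version
cd "$HADOOP_HOME" && bin/hdfs namenode -format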


- Jazz


 Post subject: Re: hadoop format problem
Posted: 2015-07-24, 10:21
Offline

Joined: 2015-07-21, 11:50
Posts: 2
jazz wrote:
This looks related to the Java classpath.
Have you set (1) the HADOOP_HOME environment variable and (2) the JAVA_HOME environment variable?

If more detailed information is needed, please run the following command and report the output.

Code:
user@T15-4-PC01 /cygdrive/c/hadoop/deploy/hadoop-2.4.1
$  bash -x bin/hdfs namenode -format


- Jazz


I later reinstalled it, but the result is still the same.
Hadoop version: hadoop-2.2.0.tar
Java version: jdk1.6.0_45
------------------------------------------------------------------------------
user@T15-4-PC01 ~/hadoop
$ bash -x bin/hdfs namenode -format
++ which bin/hdfs
+ bin=/home/user/hadoop/bin/hdfs
++ dirname /home/user/hadoop/bin/hdfs
+ bin=/home/user/hadoop/bin
++ cd /home/user/hadoop/bin
++ pwd
+ bin=/home/user/hadoop/bin
+ DEFAULT_LIBEXEC_DIR=/home/user/hadoop/bin/../libexec
+ HADOOP_LIBEXEC_DIR=/home/user/hadoop/bin/../libexec
+ . /home/user/hadoop/bin/../libexec/hdfs-config.sh
+++ which bin/hdfs
++ bin=/home/user/hadoop/bin/hdfs
+++ dirname /home/user/hadoop/bin/hdfs
++ bin=/home/user/hadoop/bin
+++ cd /home/user/hadoop/bin
+++ pwd
++ bin=/home/user/hadoop/bin
++ DEFAULT_LIBEXEC_DIR=/home/user/hadoop/bin/../libexec
++ HADOOP_LIBEXEC_DIR=/home/user/hadoop/bin/../libexec
++ '[' -e /home/user/hadoop/bin/../libexec/hadoop-config.sh ']'
++ . /home/user/hadoop/bin/../libexec/hadoop-config.sh
+++ this=/home/user/hadoop/bin/../libexec/hadoop-config.sh
+++++ dirname -- /home/user/hadoop/bin/../libexec/hadoop-config.sh
++++ cd -P -- /home/user/hadoop/bin/../libexec
++++ pwd -P
+++ common_bin=/home/user/hadoop/libexec
++++ basename -- /home/user/hadoop/bin/../libexec/hadoop-config.sh
+++ script=hadoop-config.sh
+++ this=/home/user/hadoop/libexec/hadoop-config.sh
+++ '[' -f /home/user/hadoop/libexec/hadoop-layout.sh ']'
+++ HADOOP_COMMON_DIR=share/hadoop/common
+++ HADOOP_COMMON_LIB_JARS_DIR=share/hadoop/common/lib
+++ HADOOP_COMMON_LIB_NATIVE_DIR=lib/native
+++ HDFS_DIR=share/hadoop/hdfs
+++ HDFS_LIB_JARS_DIR=share/hadoop/hdfs/lib
+++ YARN_DIR=share/hadoop/yarn
+++ YARN_LIB_JARS_DIR=share/hadoop/yarn/lib
+++ MAPRED_DIR=share/hadoop/mapreduce
+++ MAPRED_LIB_JARS_DIR=share/hadoop/mapreduce/lib
++++ cd -P -- /home/user/hadoop/libexec/..
++++ pwd -P
+++ HADOOP_DEFAULT_PREFIX=/home/user/hadoop
+++ HADOOP_PREFIX=/home/user/hadoop
+++ export HADOOP_PREFIX
+++ '[' 2 -gt 1 ']'
+++ '[' --config = namenode ']'
+++ '[' -e /home/user/hadoop/conf/hadoop-env.sh ']'
+++ DEFAULT_CONF_DIR=etc/hadoop
+++ export HADOOP_CONF_DIR=/home/user/hadoop/etc/hadoop
+++ HADOOP_CONF_DIR=/home/user/hadoop/etc/hadoop
+++ [[ '' != '' ]]
+++ '[' 2 -gt 1 ']'
+++ '[' --hosts = namenode ']'
+++ '[' --hostnames = namenode ']'
+++ [[ '' != '' ]]
+++ '[' -f /home/user/hadoop/etc/hadoop/hadoop-env.sh ']'
+++ . /home/user/hadoop/etc/hadoop/hadoop-env.sh
++++ export 'JAVA_HOME=C:\Java\jdk1.6.0_45'
++++ JAVA_HOME='C:\Java\jdk1.6.0_45'
++++ export HADOOP_CONF_DIR=/home/user/hadoop/etc/hadoop
++++ HADOOP_CONF_DIR=/home/user/hadoop/etc/hadoop
++++ for f in '$HADOOP_HOME/contrib/capacity-scheduler/*.jar'
++++ '[' '' ']'
++++ export 'HADOOP_CLASSPATH=/contrib/capacity-scheduler/*.jar'
++++ HADOOP_CLASSPATH='/contrib/capacity-scheduler/*.jar'
++++ export 'HADOOP_OPTS= -Djava.net.preferIPv4Stack=true'
++++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true'
++++ export 'HADOOP_NAMENODE_OPTS=-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
++++ HADOOP_NAMENODE_OPTS='-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
++++ export 'HADOOP_DATANODE_OPTS=-Dhadoop.security.logger=ERROR,RFAS '
++++ HADOOP_DATANODE_OPTS='-Dhadoop.security.logger=ERROR,RFAS '
++++ export 'HADOOP_SECONDARYNAMENODE_OPTS=-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
++++ HADOOP_SECONDARYNAMENODE_OPTS='-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
++++ export 'HADOOP_CLIENT_OPTS=-Xmx512m '
++++ HADOOP_CLIENT_OPTS='-Xmx512m '
++++ export HADOOP_SECURE_DN_USER=
++++ HADOOP_SECURE_DN_USER=
++++ export HADOOP_SECURE_DN_LOG_DIR=/
++++ HADOOP_SECURE_DN_LOG_DIR=/
++++ export HADOOP_PID_DIR=
++++ HADOOP_PID_DIR=
++++ export HADOOP_SECURE_DN_PID_DIR=
++++ HADOOP_SECURE_DN_PID_DIR=
++++ export HADOOP_IDENT_STRING=user
++++ HADOOP_IDENT_STRING=user
++++ /sbin/sysctl -n net.ipv6.bindv6only
+++ bindv6only=
+++ '[' -n '' ']'
+++ export MALLOC_ARENA_MAX=4
+++ MALLOC_ARENA_MAX=4
+++ [[ -z C:\Java\jdk1.6.0_45 ]]
+++ JAVA='C:\Java\jdk1.6.0_45/bin/java'
+++ JAVA_HEAP_MAX=-Xmx1000m
+++ '[' '' '!=' '' ']'
+++ CLASSPATH=/home/user/hadoop/etc/hadoop
+++ IFS=
+++ '[' '' = '' ']'
+++ '[' -d /home/user/hadoop/share/hadoop/common ']'
+++ export HADOOP_COMMON_HOME=/home/user/hadoop
+++ HADOOP_COMMON_HOME=/home/user/hadoop
+++ '[' -d /home/user/hadoop/share/hadoop/common/webapps ']'
+++ '[' -d /home/user/hadoop/share/hadoop/common/lib ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*'
+++ '[' '' = '' ']'
+++ HADOOP_LOG_DIR=/home/user/hadoop/logs
+++ '[' '' = '' ']'
+++ HADOOP_LOGFILE=hadoop.log
+++ '[' '' = '' ']'
+++ HADOOP_POLICYFILE=hadoop-policy.xml
+++ unset IFS
+++ '[' -d /home/user/hadoop/build/native -o -d /home/user/hadoop/lib/native ']'
+++ '[' -d /home/user/hadoop/lib/native ']'
+++ '[' x '!=' x ']'
+++ JAVA_LIBRARY_PATH=/home/user/hadoop/lib/native
+++ TOOL_PATH='/home/user/hadoop/share/hadoop/tools/lib/*'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console'
+++ '[' x/home/user/hadoop/lib/native '!=' x ']'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native'
+++ export LD_LIBRARY_PATH=:/home/user/hadoop/lib/native
+++ LD_LIBRARY_PATH=:/home/user/hadoop/lib/native
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true'
+++ '[' '' = '' ']'
+++ '[' -d /home/user/hadoop/share/hadoop/hdfs ']'
+++ export HADOOP_HDFS_HOME=/home/user/hadoop
+++ HADOOP_HDFS_HOME=/home/user/hadoop
+++ '[' -d /home/user/hadoop/share/hadoop/hdfs/webapps ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs'
+++ '[' -d /home/user/hadoop/share/hadoop/hdfs/lib ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*'
+++ '[' '' = '' ']'
+++ '[' -d /home/user/hadoop/share/hadoop/yarn ']'
+++ export HADOOP_YARN_HOME=/home/user/hadoop
+++ HADOOP_YARN_HOME=/home/user/hadoop
+++ '[' -d /home/user/hadoop/share/hadoop/yarn/webapps ']'
+++ '[' -d /home/user/hadoop/share/hadoop/yarn/lib ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*'
+++ '[' '' = '' ']'
+++ '[' -d /home/user/hadoop/share/hadoop/mapreduce ']'
+++ export HADOOP_MAPRED_HOME=/home/user/hadoop
+++ HADOOP_MAPRED_HOME=/home/user/hadoop
+++ '[' /home/user/hadoop/share/hadoop/mapreduce '!=' /home/user/hadoop/share/hadoop/yarn ']'
+++ '[' -d /home/user/hadoop/share/hadoop/mapreduce/webapps ']'
+++ '[' -d /home/user/hadoop/share/hadoop/mapreduce/lib ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*:/home/user/hadoop/share/hadoop/mapreduce/*'
+++ '[' '/contrib/capacity-scheduler/*.jar' '!=' '' ']'
+++ '[' '' '!=' '' ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*:/home/user/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar'
+ '[' 2 = 0 ']'
+ COMMAND=namenode
+ shift
+ case $COMMAND in
+ '[' namenode == datanode ']'
+ '[' namenode = namenode ']'
+ CLASS=org.apache.hadoop.hdfs.server.namenode.NameNode
+ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
+ export 'CLASSPATH=/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*:/home/user/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar'
+ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*:/home/user/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar'
+ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,NullAppender'
+ '[' '' = true ']'
+ exec 'C:\Java\jdk1.6.0_45/bin/java' -Dproc_namenode -Xmx1000m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.hdfs.server.namenode.NameNode -format
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/NameNode
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.namenode.NameNode
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.hdfs.server.namenode.NameNode. Program will exit.
Exception in thread "main"
---------------------------------------------
Screenshot of the environment variable settings:


Attachment: path.jpg
 Post subject: Re: hadoop format problem
Posted: 2015-07-25, 01:23
Offline

Joined: 2009-11-09, 19:52
Posts: 2891
As I recall, Hadoop 2.x and later ships .cmd scripts for Windows.
I suggest first checking the official documentation:
http://wiki.apache.org/hadoop/Hadoop2OnWindows
(I haven't installed Hadoop on Windows in a long time, and setting up a test environment would take a while.)
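
For instance, here is a quick way to confirm from the Cygwin shell that those Windows-native scripts are actually present in your unpacked tarball (just an illustrative check; the relative paths assume you are sitting in the Hadoop install directory):

Code:
# Run from the top of the Hadoop install directory.
# Hadoop 2.x tarballs normally ship Windows batch equivalents of the bin/ scripts.
ls bin/*.cmd etc/hadoop/*.cmd
# If bin/hdfs.cmd and etc/hadoop/hadoop-env.cmd are listed, the wiki page above
# describes running them from a Windows Command Prompt instead of Cygwin.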

If you don't want to spend too much time on the environment itself, you could consider these options:
1. https://code.google.com/p/windoop/
2. http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.2.4-Win/index.html

My old Hadoop4Win project hasn't been maintained for a long time:
http://trac.3du.me/cloud/wiki/Hadoop4Win
Unless you only want to learn what Hadoop is and follow some of the hands-on tutorials at http://trac.3du.me/cloud,
I suspect it isn't the environment you actually want to use.

- Jazz

