jazz wrote:
arhikydark99 wrote:
While installing Hadoop, at the namenode format step I get the screen shown in the attached image.
How can I resolve this?
user@T15-4-PC01 /cygdrive/c/hadoop/deploy/hadoop-2.4.1
$ bin/hdfs namenode -format
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/NameNode
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.namenode.NameNode
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.hdfs.server.namenode.NameNode. Program will exit.
Exception in thread "main"
This looks like a Java classpath problem.
Have you set (1) the HADOOP_HOME environment variable and (2) the JAVA_HOME environment variable?
If you need more detailed information, please run the following command and report the output.
Code:
user@T15-4-PC01 /cygdrive/c/hadoop/deploy/hadoop-2.4.1
$ bash -x bin/hdfs namenode -format
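In the meantime, a quick way to confirm whether the two variables are visible in your Cygwin shell is a check like the one below (just a sketch; the export paths are assumed examples, adjust them to your actual install locations):
Code:
# show what the current Cygwin shell has for the two variables
echo "JAVA_HOME=$JAVA_HOME"
echo "HADOOP_HOME=$HADOOP_HOME"

# if either is empty, export it before retrying the format
# (assumed example paths -- replace with your own)
export JAVA_HOME=/cygdrive/c/Java/jdk1.6.0_45
export HADOOP_HOME=/cygdrive/c/hadoop/deploy/hadoop-2.4.1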
- Jazz
I later reinstalled it, but the result is still the same.
My Hadoop version is hadoop-2.2.0.tar
and my Java version is jdk1.6.0_45.
------------------------------------------------------------------------------
user@T15-4-PC01 ~/hadoop
$ bash -x bin/hdfs namenode -format
++ which bin/hdfs
+ bin=/home/user/hadoop/bin/hdfs
++ dirname /home/user/hadoop/bin/hdfs
+ bin=/home/user/hadoop/bin
++ cd /home/user/hadoop/bin
++ pwd
+ bin=/home/user/hadoop/bin
+ DEFAULT_LIBEXEC_DIR=/home/user/hadoop/bin/../libexec
+ HADOOP_LIBEXEC_DIR=/home/user/hadoop/bin/../libexec
+ . /home/user/hadoop/bin/../libexec/hdfs-config.sh
+++ which bin/hdfs
++ bin=/home/user/hadoop/bin/hdfs
+++ dirname /home/user/hadoop/bin/hdfs
++ bin=/home/user/hadoop/bin
+++ cd /home/user/hadoop/bin
+++ pwd
++ bin=/home/user/hadoop/bin
++ DEFAULT_LIBEXEC_DIR=/home/user/hadoop/bin/../libexec
++ HADOOP_LIBEXEC_DIR=/home/user/hadoop/bin/../libexec
++ '[' -e /home/user/hadoop/bin/../libexec/hadoop-config.sh ']'
++ . /home/user/hadoop/bin/../libexec/hadoop-config.sh
+++ this=/home/user/hadoop/bin/../libexec/hadoop-config.sh
+++++ dirname -- /home/user/hadoop/bin/../libexec/hadoop-config.sh
++++ cd -P -- /home/user/hadoop/bin/../libexec
++++ pwd -P
+++ common_bin=/home/user/hadoop/libexec
++++ basename -- /home/user/hadoop/bin/../libexec/hadoop-config.sh
+++ script=hadoop-config.sh
+++ this=/home/user/hadoop/libexec/hadoop-config.sh
+++ '[' -f /home/user/hadoop/libexec/hadoop-layout.sh ']'
+++ HADOOP_COMMON_DIR=share/hadoop/common
+++ HADOOP_COMMON_LIB_JARS_DIR=share/hadoop/common/lib
+++ HADOOP_COMMON_LIB_NATIVE_DIR=lib/native
+++ HDFS_DIR=share/hadoop/hdfs
+++ HDFS_LIB_JARS_DIR=share/hadoop/hdfs/lib
+++ YARN_DIR=share/hadoop/yarn
+++ YARN_LIB_JARS_DIR=share/hadoop/yarn/lib
+++ MAPRED_DIR=share/hadoop/mapreduce
+++ MAPRED_LIB_JARS_DIR=share/hadoop/mapreduce/lib
++++ cd -P -- /home/user/hadoop/libexec/..
++++ pwd -P
+++ HADOOP_DEFAULT_PREFIX=/home/user/hadoop
+++ HADOOP_PREFIX=/home/user/hadoop
+++ export HADOOP_PREFIX
+++ '[' 2 -gt 1 ']'
+++ '[' --config = namenode ']'
+++ '[' -e /home/user/hadoop/conf/hadoop-env.sh ']'
+++ DEFAULT_CONF_DIR=etc/hadoop
+++ export HADOOP_CONF_DIR=/home/user/hadoop/etc/hadoop
+++ HADOOP_CONF_DIR=/home/user/hadoop/etc/hadoop
+++ [[ '' != '' ]]
+++ '[' 2 -gt 1 ']'
+++ '[' --hosts = namenode ']'
+++ '[' --hostnames = namenode ']'
+++ [[ '' != '' ]]
+++ '[' -f /home/user/hadoop/etc/hadoop/hadoop-env.sh ']'
+++ . /home/user/hadoop/etc/hadoop/hadoop-env.sh
++++ export 'JAVA_HOME=C:\Java\jdk1.6.0_45'
++++ JAVA_HOME='C:\Java\jdk1.6.0_45'
++++ export HADOOP_CONF_DIR=/home/user/hadoop/etc/hadoop
++++ HADOOP_CONF_DIR=/home/user/hadoop/etc/hadoop
++++ for f in '$HADOOP_HOME/contrib/capacity-scheduler/*.jar'
++++ '[' '' ']'
++++ export 'HADOOP_CLASSPATH=/contrib/capacity-scheduler/*.jar'
++++ HADOOP_CLASSPATH='/contrib/capacity-scheduler/*.jar'
++++ export 'HADOOP_OPTS= -Djava.net.preferIPv4Stack=true'
++++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true'
++++ export 'HADOOP_NAMENODE_OPTS=-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
++++ HADOOP_NAMENODE_OPTS='-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
++++ export 'HADOOP_DATANODE_OPTS=-Dhadoop.security.logger=ERROR,RFAS '
++++ HADOOP_DATANODE_OPTS='-Dhadoop.security.logger=ERROR,RFAS '
++++ export 'HADOOP_SECONDARYNAMENODE_OPTS=-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
++++ HADOOP_SECONDARYNAMENODE_OPTS='-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
++++ export 'HADOOP_CLIENT_OPTS=-Xmx512m '
++++ HADOOP_CLIENT_OPTS='-Xmx512m '
++++ export HADOOP_SECURE_DN_USER=
++++ HADOOP_SECURE_DN_USER=
++++ export HADOOP_SECURE_DN_LOG_DIR=/
++++ HADOOP_SECURE_DN_LOG_DIR=/
++++ export HADOOP_PID_DIR=
++++ HADOOP_PID_DIR=
++++ export HADOOP_SECURE_DN_PID_DIR=
++++ HADOOP_SECURE_DN_PID_DIR=
++++ export HADOOP_IDENT_STRING=user
++++ HADOOP_IDENT_STRING=user
++++ /sbin/sysctl -n net.ipv6.bindv6only
+++ bindv6only=
+++ '[' -n '' ']'
+++ export MALLOC_ARENA_MAX=4
+++ MALLOC_ARENA_MAX=4
+++ [[ -z C:\Java\jdk1.6.0_45 ]]
+++ JAVA='C:\Java\jdk1.6.0_45/bin/java'
+++ JAVA_HEAP_MAX=-Xmx1000m
+++ '[' '' '!=' '' ']'
+++ CLASSPATH=/home/user/hadoop/etc/hadoop
+++ IFS=
+++ '[' '' = '' ']'
+++ '[' -d /home/user/hadoop/share/hadoop/common ']'
+++ export HADOOP_COMMON_HOME=/home/user/hadoop
+++ HADOOP_COMMON_HOME=/home/user/hadoop
+++ '[' -d /home/user/hadoop/share/hadoop/common/webapps ']'
+++ '[' -d /home/user/hadoop/share/hadoop/common/lib ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*'
+++ '[' '' = '' ']'
+++ HADOOP_LOG_DIR=/home/user/hadoop/logs
+++ '[' '' = '' ']'
+++ HADOOP_LOGFILE=hadoop.log
+++ '[' '' = '' ']'
+++ HADOOP_POLICYFILE=hadoop-policy.xml
+++ unset IFS
+++ '[' -d /home/user/hadoop/build/native -o -d /home/user/hadoop/lib/native ']'
+++ '[' -d /home/user/hadoop/lib/native ']'
+++ '[' x '!=' x ']'
+++ JAVA_LIBRARY_PATH=/home/user/hadoop/lib/native
+++ TOOL_PATH='/home/user/hadoop/share/hadoop/tools/lib/*'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console'
+++ '[' x/home/user/hadoop/lib/native '!=' x ']'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native'
+++ export LD_LIBRARY_PATH=:/home/user/hadoop/lib/native
+++ LD_LIBRARY_PATH=:/home/user/hadoop/lib/native
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml'
+++ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true'
+++ '[' '' = '' ']'
+++ '[' -d /home/user/hadoop/share/hadoop/hdfs ']'
+++ export HADOOP_HDFS_HOME=/home/user/hadoop
+++ HADOOP_HDFS_HOME=/home/user/hadoop
+++ '[' -d /home/user/hadoop/share/hadoop/hdfs/webapps ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs'
+++ '[' -d /home/user/hadoop/share/hadoop/hdfs/lib ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*'
+++ '[' '' = '' ']'
+++ '[' -d /home/user/hadoop/share/hadoop/yarn ']'
+++ export HADOOP_YARN_HOME=/home/user/hadoop
+++ HADOOP_YARN_HOME=/home/user/hadoop
+++ '[' -d /home/user/hadoop/share/hadoop/yarn/webapps ']'
+++ '[' -d /home/user/hadoop/share/hadoop/yarn/lib ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*'
+++ '[' '' = '' ']'
+++ '[' -d /home/user/hadoop/share/hadoop/mapreduce ']'
+++ export HADOOP_MAPRED_HOME=/home/user/hadoop
+++ HADOOP_MAPRED_HOME=/home/user/hadoop
+++ '[' /home/user/hadoop/share/hadoop/mapreduce '!=' /home/user/hadoop/share/hadoop/yarn ']'
+++ '[' -d /home/user/hadoop/share/hadoop/mapreduce/webapps ']'
+++ '[' -d /home/user/hadoop/share/hadoop/mapreduce/lib ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*:/home/user/hadoop/share/hadoop/mapreduce/*'
+++ '[' '/contrib/capacity-scheduler/*.jar' '!=' '' ']'
+++ '[' '' '!=' '' ']'
+++ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*:/home/user/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar'
+ '[' 2 = 0 ']'
+ COMMAND=namenode
+ shift
+ case $COMMAND in
+ '[' namenode == datanode ']'
+ '[' namenode = namenode ']'
+ CLASS=org.apache.hadoop.hdfs.server.namenode.NameNode
+ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender '
+ export 'CLASSPATH=/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*:/home/user/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar'
+ CLASSPATH='/home/user/hadoop/etc/hadoop:/home/user/hadoop/share/hadoop/common/lib/*:/home/user/hadoop/share/hadoop/common/*:/home/user/hadoop/share/hadoop/hdfs:/home/user/hadoop/share/hadoop/hdfs/lib/*:/home/user/hadoop/share/hadoop/hdfs/*:/home/user/hadoop/share/hadoop/yarn/lib/*:/home/user/hadoop/share/hadoop/yarn/*:/home/user/hadoop/share/hadoop/mapreduce/lib/*:/home/user/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar'
+ HADOOP_OPTS=' -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,NullAppender'
+ '[' '' = true ']'
+ exec 'C:\Java\jdk1.6.0_45/bin/java' -Dproc_namenode -Xmx1000m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/user/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/user/hadoop -Dhadoop.id.str=user -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/user/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.hdfs.server.namenode.NameNode -format
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/NameNode
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.server.namenode.NameNode
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.hdfs.server.namenode.NameNode. Program will exit.
Exception in thread "main"
---------------------------------------------
Screenshot of the environment variable settings:
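From the trace above, the exec line hands a Cygwin-style classpath under /home/user/hadoop/share/hadoop/... to the JVM at C:\Java\jdk1.6.0_45/bin/java. A quick sanity check is to confirm the HDFS jars really exist at those paths and to print the classpath the Hadoop wrapper scripts compute (a minimal sketch, assuming the same directory layout shown in the trace):
Code:
# confirm the jars are present where the CLASSPATH points
ls /home/user/hadoop/share/hadoop/common/hadoop-common-*.jar
ls /home/user/hadoop/share/hadoop/hdfs/hadoop-hdfs-*.jar

# print the classpath the wrapper scripts compute, to compare with the exec line above
bin/hadoop classpath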