I'm running into an Exception when executing my jar.
I've searched online and found quite a few similar problems,
but none of the fixes solved mine,
so I'd like to ask for your help.
The environment is a standalone / pseudo-distributed setup:
Hadoop 2.4.1
Ubuntu 14.04
The details of the exception are in the attached screenshot:
Attachment: 擷取.PNG [ 61.91 KiB ]
Here is the example I'm trying to run:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileSystemOption {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs1 = FileSystem.get(conf);

        // Create the target directory and upload a local file into it.
        Path dir1 = new Path("/user/hadoop/hwtwo");
        Path file1 = new Path(dir1 + "/a.txt");
        fs1.mkdirs(dir1);
        fs1.copyFromLocalFile(new Path("/usr/local/hadoop/files/sample.txt"), file1);

        // Print some metadata about the uploaded file.
        FileStatus status1 = fs1.getFileStatus(file1);
        System.out.println(status1.getPath());
        System.out.println(status1.getPath().toUri().getPath());
        System.out.println(status1.getBlockSize());
        System.out.println(status1.getGroup());
        System.out.println(status1.getOwner());
        System.out.println(fs1.isFile(file1));

        // Read the file back 128 bytes at a time and print the contents.
        FSDataInputStream fsin = fs1.open(file1);
        try {
            byte[] buff = new byte[128];
            int length = 0;
            while ((length = fsin.read(buff, 0, 128)) != -1) {
                System.out.println(new String(buff, 0, length));
            }
        } finally {
            fsin.close();  // avoid leaking the input stream
        }
    }
}
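For reference, the manual read loop at the end could also be replaced with Hadoop's IOUtils.copyBytes helper, which streams the file straight to stdout without a hand-rolled buffer. This is just a minimal sketch, assuming the same /user/hadoop/hwtwo/a.txt path as above; the class name CatFile is made up for illustration:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Hypothetical class name, used only for this sketch.
public class CatFile {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Open the same file that FileSystemOption uploads above.
        FSDataInputStream in = fs.open(new Path("/user/hadoop/hwtwo/a.txt"));
        try {
            // Copy the stream to stdout in 4096-byte chunks; 'false' tells
            // copyBytes not to close the streams itself.
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}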
P.S. Running the wordcount example works fine.
I hope someone can help me solve this problem;
it has been bothering me for a long time.
Thanks very much in advance!