The job runs fine under Linux, but running it from Eclipse throws an error. Eclipse can connect to the Hadoop cluster, and the cluster's directories are visible under DFS. The error is as follows:
Code:
Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.1.2:9000/user/keyan/newout, expected: hdfs://Master.Hadoop:9000
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
at org.apache.hadoop.hdfs.DistributedFileSystem.checkPath(DistributedFileSystem.java:99)
at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:155)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:453)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:648)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:122)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:770)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:79)
The paths are set up as follows:
Code:
Configuration conf = new Configuration();
conf.set("mapred.job.tracker", "192.168.1.2:9001");
String[] ars = new String[] {"hdfs://192.168.1.2:9000/user/keyan/input", "hdfs://192.168.1.2:9000/user/keyan/newout"};
String[] otherArgs = new GenericOptionsParser(conf, ars).getRemainingArgs();
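
From the stack trace it looks like FileSystem.checkPath rejects the paths because their authority (the IP 192.168.1.2:9000) differs from the default FS the cluster reports as expected (hdfs://Master.Hadoop:9000). Would rewriting the setup like this be the right fix? This is only a sketch; it assumes Master.Hadoop resolves to 192.168.1.2 on the Eclipse machine (e.g. via /etc/hosts), and that setting fs.default.name explicitly in the client Configuration is acceptable here:
Code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.util.GenericOptionsParser;

// Assumption: Master.Hadoop resolves to 192.168.1.2 in the client's /etc/hosts.
Configuration conf = new Configuration();
// Match the default FS to the authority the exception says it expects.
conf.set("fs.default.name", "hdfs://Master.Hadoop:9000");
conf.set("mapred.job.tracker", "192.168.1.2:9001");
// Use the same hostname-based authority in the input and output paths.
String[] ars = new String[] {
    "hdfs://Master.Hadoop:9000/user/keyan/input",
    "hdfs://Master.Hadoop:9000/user/keyan/newout"
};
String[] otherArgs = new GenericOptionsParser(conf, ars).getRemainingArgs();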
Could anyone point me in the right direction? Any help would be much appreciated!