1) Start the environment
start-all.sh
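To confirm that HDFS is actually answering before going further, you can list the root directory (a quick sanity check, not part of the original steps):

hadoop fs -ls /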
2) Check the status
jps
0613 NameNode
10733 DataNode
3455 NodeManager
15423 Jps
11082 ResourceManager
10913 SecondaryNameNode
3) Write the program in Eclipse
1. Write the WordMap class
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordMap extends Mapper<Object, Text, Text, IntWritable> {

    @Override
    protected void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        // Split each input line on spaces and emit (word, 1) for every token
        String line = value.toString();
        String[] words = line.split(" ");
        for (String str : words) {
            context.write(new Text(str), new IntWritable(1));
        }
    }
}
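As a concrete illustration (the sample line is hypothetical, not from the tutorial), an input line such as `hello world hello` makes this mapper emit one (word, 1) pair per token:

(hello, 1)
(world, 1)
(hello, 1)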
2. Write the WordReduce class
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class WordReduce extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    protected void reduce(Text text, Iterable<IntWritable> itrs, Context context)
            throws IOException, InterruptedException {
        // Sum the 1s emitted by the mapper for this word
        int sum = 0;
        for (IntWritable itr : itrs) {
            sum += itr.get();
        }
        context.write(text, new IntWritable(sum));
    }
}
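Continuing the same hypothetical line, the framework groups the map output by key before calling reduce, so each call receives one word together with all of its counts:

reduce(hello, [1, 1])  ->  (hello, 2)
reduce(world, [1])     ->  (world, 1)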
3. Write the WordCount class
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Job job = Job.getInstance(conf);
        job.setJobName("wc");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(WordMap.class);
        job.setReducerClass(WordReduce.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path("/word.txt"));

        // Delete the output directory if it already exists, or the job will fail
        Path out = new Path("/out");
        if (fs.exists(out)) {
            fs.delete(out, true);
        }
        FileOutputFormat.setOutputPath(job, out);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
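Note that the driver reads /word.txt from HDFS, so the input file must be uploaded before the job is submitted. A minimal sketch, assuming a local text file named word.txt:

# Copy the local input file into the HDFS root
hadoop fs -put word.txt /word.txt
# Confirm the upload
hadoop fs -cat /word.txt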
4) Export the jar file
5) Upload the jar to a Linux directory via FTP
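Any transfer tool works for this step; for example, with scp instead of FTP (the user, host, and directory below are placeholders):

scp wc.jar user@hadoop-master:/home/user/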
6) Run the jar
hadoop jar wc.jar com.mc.WordCount / /out
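Since the input and output paths are hard-coded in WordCount, the two trailing arguments above are ignored as written. If you would rather have the command line control them, a minimal sketch of the change to main (an assumption, not part of the original code):

// Use the paths passed on the command line instead of the hard-coded ones
FileInputFormat.addInputPath(job, new Path(args[0]));
Path out = new Path(args[1]);
if (fs.exists(out)) {
    fs.delete(out, true);
}
FileOutputFormat.setOutputPath(job, out);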
7) If map and reduce both reach 100% and the job finishes without errors, the run succeeded!
8) View the results
hadoop fs -tail /out/part-r-00000
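The output file holds tab-separated word/count pairs; for the hypothetical input above, the tail would look like:

hello	2
world	1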