This article demonstrates "how to use KeyValueTextInputFormat in Hadoop". The content is concise and clearly organized, and should clear up any confusion on the topic. Follow along below as we work through it together.
package com.test;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueLineRecordReader;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

/**
 * Input (tab-separated):
 *   hello   jim
 *   hello   tim
 *
 * Final output:
 *   hello 1
 *   jim   1
 *   hello 1
 *   tim   1
 */
public class WordCountKeyValue extends Configured implements Tool {

    public static class Map extends Mapper<Text, Text, Text, IntWritable> {

        /**
         * For the line "hello\tjim": key = "hello", value = "jim".
         * Each of them is emitted with a count of 1.
         */
        @Override
        public void map(Text key, Text value, Context context)
                throws IOException, InterruptedException {
            context.write(key, new IntWritable(1));
            context.write(value, new IntWritable(1));
        }
    }

    @Override
    public int run(String[] args) throws IOException, InterruptedException, ClassNotFoundException {
        Configuration conf = this.getConf();
        // Set the key/value separator used by KeyValueTextInputFormat; the default is "\t".
        // Equivalent to: conf.set("mapreduce.input.keyvaluelinerecordreader.key.value.separator", "\t");
        conf.set(KeyValueLineRecordReader.KEY_VALUE_SEPERATOR, "\t");

        Job job = new Job(conf);
        job.setJobName(WordCountKeyValue.class.getSimpleName());
        job.setJarByClass(WordCountKeyValue.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Map-only job: with zero reducers, the mapper output is written directly.
        job.setNumReduceTasks(0);
        job.setMapperClass(Map.class);
        job.setInputFormatClass(KeyValueTextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);

        job.waitForCompletion(true);
        return job.isSuccessful() ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        int exit = ToolRunner.run(new WordCountKeyValue(), args);
        System.exit(exit);
    }
}
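As a minimal sketch (not part of the original article) of what KeyValueTextInputFormat does with each line: its record reader looks for the first occurrence of the configured separator; everything before it becomes the mapper's input key and everything after it becomes the value, and if the separator is missing the whole line is the key and the value is empty. The snippet below only approximates that logic in plain Java; the variable names are illustrative.

// Illustrative only: approximates the split performed for each input line.
String line = "hello\tjim";      // one raw input line
String separator = "\t";         // matches the separator configured in the job above
int pos = line.indexOf(separator);
Text key = new Text(pos == -1 ? line : line.substring(0, pos));
Text value = new Text(pos == -1 ? "" : line.substring(pos + 1));
// key = "hello", value = "jim" -- exactly what the Map class above receives

To try the job, package it into a jar and submit it with a command along the lines of hadoop jar wordcount.jar com.test.WordCountKeyValue <input dir> <output dir>; the jar name here is only a placeholder.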
That is everything in "how to use KeyValueTextInputFormat in Hadoop". Thanks for reading! Hopefully you now have a working understanding of the topic, and the material shared here proves useful. If you would like to learn more, follow the 億速云 industry news channel.