Task requirements:
Each input line contains a caller number and a callee number, separated by a space; for every callee, output the full list of callers that dialed it, joined with "|".
// Input file format
18661629496 110
13107702446 110
1234567 120
2345678 120
987654 110
2897839274 18661629496
// Output file format
110 18661629496|13107702446|987654|18661629496|13107702446|987654|
120 1234567|2345678|1234567|2345678|
18661629496 2897839274|2897839274|
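In other words, the map stage inverts each "caller callee" line into a (callee, caller) pair, the shuffle groups the pairs by callee, and the reduce stage joins each group of callers with "|". For callee 120, with a single copy of the sample input, the data flows like this:

map output:    (120, 1234567), (120, 2345678)
reduce input:  (120, [1234567, 2345678])
reduce output: 120 1234567|2345678|

(The sample output above lists every caller twice because the job was later run against two copies of the input file; see "Total input paths to process : 2" in the log below.)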
Writing the MapReduce program:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Test2 {

    enum Counter {
        LINESKIP // counts input lines that failed to parse
    }

    public static class Map extends Mapper<LongWritable, Text, Text, Text> {
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString(); // read one source line
            try {
                // split "18661629496 110" into caller and callee
                String[] lineSplit = line.split(" ");
                String anum = lineSplit[0]; // caller
                String bnum = lineSplit[1]; // callee
                // emit (callee, caller), e.g. (110, 18661629496)
                context.write(new Text(bnum), new Text(anum));
            } catch (ArrayIndexOutOfBoundsException e) {
                // malformed line: bump the counter and skip it
                context.getCounter(Counter.LINESKIP).increment(1);
            }
        }
    }

    public static class Reduce extends Reducer<Text, Text, Text, Text> {
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            // join all callers of this callee with "|"
            StringBuilder out = new StringBuilder();
            for (Text value : values) {
                out.append(value.toString()).append("|");
            }
            context.write(key, new Text(out.toString()));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        if (args.length != 2) {
            System.err.println("Usage: Test2 <input path> <output path>");
            System.exit(2);
        }
        Job job = new Job(conf, "telephone"); // job name

        // classes
        job.setJarByClass(Test2.class);
        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);

        // map output types
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class);

        // job output types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        // input and output paths
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // exit when the job finishes
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
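One caveat about the parsing: the mapper splits on a single space, so a tab-separated line fails to split and lands under LINESKIP, while a line with repeated spaces yields an empty callee field. Splitting on the whitespace regex "\\s+" instead would tolerate both cases; that is an alternative, not what the code above does.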
Packaging the MapReduce program as a jar file:
1. Right-click the project name -> Export -> Java -> JAR file
2. Choose where to save the jar file
3. Select the main class
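Alternatively, if Eclipse is not at hand, the jar can be built from the command line; a minimal sketch, run from the Hadoop 0.20.2 installation directory (the source file name Test2.java and the test2_classes directory are assumptions):

mkdir test2_classes
javac -classpath hadoop-0.20.2-core.jar -d test2_classes Test2.java
jar -cvf test2.jar -C test2_classes/ .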
Running the jar file:
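The job reads its input from HDFS, so the data files must be uploaded first if they are not already there; a minimal sketch (the local file name phone.txt is an assumption):

bin/hadoop dfs -mkdir /user/liuqingjie/in
bin/hadoop dfs -put phone.txt /user/liuqingjie/in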
[liuqingjie@master hadoop-0.20.2]$ bin/hadoop jar /home/liuqingjie/test2.jar /user/liuqingjie/in /user/liuqingjie/out
15/05/14 01:46:47 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
15/05/14 01:46:47 INFO input.FileInputFormat: Total input paths to process : 2
15/05/14 01:46:48 INFO mapred.JobClient: Running job: job_201505132004_0005
15/05/14 01:46:49 INFO mapred.JobClient: map 0% reduce 0%
15/05/14 01:46:57 INFO mapred.JobClient: map 100% reduce 0%
15/05/14 01:47:09 INFO mapred.JobClient: map 100% reduce 100%
……………………………………………………………………………………
Viewing the results:
[liuqingjie@master hadoop-0.20.2]$ bin/hadoop dfs -cat ./out/*
cat: Source must be a file.
(This message is emitted for the _logs subdirectory that the job created inside the output folder; the actual result files are still printed below.)
110 18661629496|13107702446|987654|18661629496|13107702446|987654|
120 1234567|2345678|1234567|2345678|
18661629496 2897839274|2897839274|