The Mapper class, which implements the map method:
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    // NCDC records use 9999 to mark a missing temperature reading
    private static final int MISSING = 9999;

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // each input value is one line of the NCDC record
        String line = value.toString();
        // the year occupies columns 15-18
        String year = line.substring(15, 19);
        // the air temperature is a signed five-character field starting at
        // column 87; older Integer.parseInt implementations reject a leading
        // '+', so skip it when present
        int airTemperature;
        if (line.charAt(87) == '+') {
            airTemperature = Integer.parseInt(line.substring(88, 92));
        } else {
            airTemperature = Integer.parseInt(line.substring(87, 92));
        }
        // emit (year, temperature) only for readings that are present and
        // pass the quality check (codes 0, 1, 4, 5, 9)
        String quality = line.substring(92, 93);
        if (airTemperature != MISSING && quality.matches("[01459]")) {
            context.write(new Text(year), new IntWritable(airTemperature));
        }
    }
}
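To see what those substring offsets actually extract, here is a framework-free sketch that builds a synthetic 93-character record (the values are made up for illustration, not real NCDC data) and applies the same parsing the mapper uses:

```java
public class NcdcParseDemo {

    // Build a synthetic 93-character record: year in columns 15-18, a
    // signed temperature in columns 87-91, quality flag in column 92.
    // All field values here are hypothetical.
    public static String sampleLine() {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 93; i++) sb.append('0');
        sb.replace(15, 19, "1950");   // year
        sb.replace(87, 92, "+0123");  // 12.3 °C, stored in tenths of a degree
        sb.replace(92, 93, "1");      // quality code 1 = passed checks
        return sb.toString();
    }

    public static void main(String[] args) {
        String line = sampleLine();
        String year = line.substring(15, 19);
        int airTemperature = (line.charAt(87) == '+')
                ? Integer.parseInt(line.substring(88, 92))
                : Integer.parseInt(line.substring(87, 92));
        String quality = line.substring(92, 93);
        System.out.println(year + " " + airTemperature + " " + quality);
        // prints: 1950 123 1
    }
}
```

So for this record the mapper would emit the pair (1950, 123).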
The Reducer class, which implements the reduce method:
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // track the largest temperature seen for this year
        int maxValue = Integer.MIN_VALUE;
        for (IntWritable value : values) {
            maxValue = Math.max(maxValue, value.get());
        }
        // emit (year, maximum temperature)
        context.write(key, new IntWritable(maxValue));
    }
}
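The reduce step itself is ordinary Java: the framework hands the reducer every temperature emitted for one year, and the loop keeps the maximum. A standalone sketch of that logic, using hypothetical values for a single year:

```java
import java.util.Arrays;
import java.util.List;

public class MaxReduceDemo {

    // Simulates the reduce step without the Hadoop framework: given all
    // temperatures grouped under one key, return the maximum.
    public static int reduceMax(Iterable<Integer> values) {
        int maxValue = Integer.MIN_VALUE;
        for (int v : values) {
            maxValue = Math.max(maxValue, v);
        }
        return maxValue;
    }

    public static void main(String[] args) {
        // hypothetical tenths-of-a-degree readings for one year
        List<Integer> temps1950 = Arrays.asList(0, 22, -11);
        System.out.println(reduceMax(temps1950));
        // prints: 22
    }
}
```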
The driver class with the main method:
import mapper類實(shí)現(xiàn).MyMapper;
import reducer類實(shí)現(xiàn).MyReducer;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyMapperApp {

    public static void main(String[] args) throws Exception {
        // create the job
        Job job = Job.getInstance();
        // locate the job jar from this class
        job.setJarByClass(MyMapper.class);
        job.setJobName("Max temperature");
        // local-filesystem input and output; the output directory must not
        // already exist, or the job will fail
        FileInputFormat.addInputPath(job, new Path("file:///mnt/hgfs/test-ncdc-data"));
        FileOutputFormat.setOutputPath(job, new Path("file:///home/hadoop/mr/"));
        // wire up the mapper, reducer, and output key/value types
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // run the job and exit with 0 on success, 1 on failure
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
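One optional tweak, not part of the listing above: because taking a maximum is associative and commutative, the same reducer class can also run as a combiner, pre-aggregating each mapper's output before the shuffle and cutting network traffic. This is the standard Hadoop combiner mechanism, added in the driver with one extra line:

```java
// inside main(), after job.setReducerClass(MyReducer.class):
job.setCombinerClass(MyReducer.class); // computes per-mapper maxima before the shuffle
```

The final result is unchanged; only the amount of intermediate data moved between map and reduce tasks shrinks.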