I am getting a Java heap space error in the reducer phase. My application uses 41 reducers and a custom Partitioner class; a rough sketch of the driver setup is below, followed by the error log and the reducer code that throws the error.
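Driver setup, roughly (the driver class name, the mapper, the partitioner, and the input/output paths are placeholders here, not my exact code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "my job");
        job.setJarByClass(MyDriver.class);

        job.setMapperClass(MyMapper.class);           // placeholder mapper class
        job.setReducerClass(MyReducer.class);
        job.setPartitionerClass(MyPartitioner.class); // placeholder for my custom partitioner
        job.setNumReduceTasks(41);                    // 41 reducers

        job.setMapOutputKeyClass(NullWritable.class);
        job.setMapOutputValueClass(Text.class);
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

This is the error log from the failing run: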
17/02/12 05:26:45 INFO mapreduce.Job: map 98% reduce 0%
17/02/12 05:28:02 INFO mapreduce.Job: map 100% reduce 0%
17/02/12 05:28:09 INFO mapreduce.Job: map 100% reduce 17%
17/02/12 05:28:10 INFO mapreduce.Job: map 100% reduce 39%
17/02/12 05:28:11 INFO mapreduce.Job: map 100% reduce 46%
17/02/12 05:28:12 INFO mapreduce.Job: map 100% reduce 51%
17/02/12 05:28:13 INFO mapreduce.Job: map 100% reduce 54%
17/02/12 05:28:14 INFO mapreduce.Job: map 100% reduce 56%
17/02/12 05:28:15 INFO mapreduce.Job: map 100% reduce 88%
17/02/12 05:28:16 INFO mapreduce.Job: map 100% reduce 90%
17/02/12 05:28:18 INFO mapreduce.Job: map 100% reduce 93%
17/02/12 05:28:18 INFO mapreduce.Job: Task Id : attempt_1486663266028_2653_r_000020_0, Status : FAILED
Error: Java heap space
17/02/12 05:28:19 INFO mapreduce.Job: map 100% reduce 91%
17/02/12 05:28:20 INFO mapreduce.Job: Task Id : attempt_1486663266028_2653_r_000021_0, Status : FAILED
Error: Java heap space
17/02/12 05:28:22 INFO mapreduce.Job: Task Id : attempt_1486663266028_2653_r_000027_0, Status : FAILED
Error: Java heap space
17/02/12 05:28:23 INFO mapreduce.Job: map 100% reduce 89%
17/02/12 05:28:24 INFO mapreduce.Job: map 100% reduce 90%
17/02/12 05:28:24 INFO mapreduce.Job: Task Id : attempt_1486663266028_2653_r_000029_0, Status : FAILED
Error: Java heap space
Here is my reducer code:
import java.io.IOException;

import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;
import org.apache.log4j.Logger;

public class MyReducer extends Reducer<NullWritable, Text, NullWritable, Text> {

    private Logger logger = Logger.getLogger(MyReducer.class);
    StringBuilder sb = new StringBuilder(); // buffer shared across all reduce() calls
    private MultipleOutputs<NullWritable, Text> multipleOutputs;

    @Override
    public void setup(Context context) {
        logger.info("Inside Reducer.");
        multipleOutputs = new MultipleOutputs<NullWritable, Text>(context);
    }

    @Override
    public void reduce(NullWritable key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        for (Text value : values) {
            final String valueStr = value.toString();
            if (valueStr.contains("Japan")) {
                sb.append(valueStr.substring(0, valueStr.length() - 20));
            } else if (valueStr.contains("SelfSourcedPrivate")) {
                sb.append(valueStr.substring(0, valueStr.length() - 29));
            } else if (valueStr.contains("SelfSourcedPublic")) {
                sb.append(valueStr.substring(0, valueStr.length() - 29));
            } else if (valueStr.contains("ThirdPartyPrivate")) {
                sb.append(valueStr.substring(0, valueStr.length() - 25));
            }
        }
        multipleOutputs.write(NullWritable.get(), new Text(sb.toString()), "MyFileName");
    }

    @Override
    public void cleanup(Context context) throws IOException, InterruptedException {
        multipleOutputs.close();
    }
}
Can you suggest any change that would solve my problem? Would using a combiner class help?
Finally, I managed to resolve it. I just moved

multipleOutputs.write(NullWritable.get(), new Text(sb.toString()), strName);

inside the for loop, and that solved my problem. I have tested it with a very large data set (a 19 GB file) and it worked fine for me. This is my final solution. Initially I thought it might create too many objects, but it works fine for me, and the MapReduce job also completes very quickly.
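In outline, the reworked reduce() looks like the sketch below. For clarity I have dropped the shared StringBuilder here and write each trimmed record as soon as it is produced; the trim lengths are the same as in the original code, and "MyFileName" stands in for whatever strName resolves to in my real code.

@Override
public void reduce(NullWritable key, Iterable<Text> values, Context context)
        throws IOException, InterruptedException {
    for (Text value : values) {
        final String valueStr = value.toString();
        String trimmed = null;
        if (valueStr.contains("Japan")) {
            trimmed = valueStr.substring(0, valueStr.length() - 20);
        } else if (valueStr.contains("SelfSourcedPrivate")) {
            trimmed = valueStr.substring(0, valueStr.length() - 29);
        } else if (valueStr.contains("SelfSourcedPublic")) {
            trimmed = valueStr.substring(0, valueStr.length() - 29);
        } else if (valueStr.contains("ThirdPartyPrivate")) {
            trimmed = valueStr.substring(0, valueStr.length() - 25);
        }
        if (trimmed != null) {
            // Emit each record immediately instead of buffering the whole
            // partition in memory, which is what was blowing the heap.
            multipleOutputs.write(NullWritable.get(), new Text(trimmed), "MyFileName");
        }
    }
}

Writing per record keeps the reducer's memory footprint flat no matter how many values arrive, which matters here because the NullWritable key funnels a huge number of values through each reduce() call.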