Can I write a Hadoop job that has only Mappers and Combiners (i.e., mini-reducers), with no Reducer?
job.setMapperClass(WordCountMapper.class);
job.setCombinerClass(WordCountReducer.class);
conf.setInt("mapred.reduce.tasks", 0);
I tried to do so, but I always see one reduce task on the JobTracker page:
Launched reduce tasks = 1
How can I get rid of the reducers while keeping the combiners? Is that possible?
You need to tell your job that you don't want any reducers: JobConf.html#setNumReduceTasks(int)
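For example, a minimal map-only driver with the new mapreduce API could look roughly like this (it reuses WordCountMapper from your question and assumes Text/IntWritable output types, so adjust to your actual classes):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MapOnlyWordCount {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "map-only word count");
        job.setJarByClass(MapOnlyWordCount.class);

        job.setMapperClass(WordCountMapper.class);
        // Zero reduce tasks: map output is written straight to HDFS, no shuffle.
        job.setNumReduceTasks(0);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

As far as I know, the combiner only runs as part of the map-side sort/spill, and that path is skipped entirely when there are zero reduce tasks, so in a purely map-only job a configured combiner is never invoked.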
You can achieve the same thing with IdentityReducer.
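A rough sketch with the old mapred API (this assumes WordCountMapper and WordCountReducer implement the old org.apache.hadoop.mapred interfaces, which may not match the new-API classes in your question):

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.IdentityReducer;

public class CombinerWithIdentityReducer {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(CombinerWithIdentityReducer.class);
        conf.setJobName("wordcount-combiner-identity");

        conf.setMapperClass(WordCountMapper.class);
        conf.setCombinerClass(WordCountReducer.class);
        // IdentityReducer writes every (key, value) it receives back out unchanged,
        // so the job output is essentially the combiner output merged across mappers.
        conf.setReducerClass(IdentityReducer.class);
        // Keep at least one reduce task so the shuffle (and therefore the combiner) still runs.
        conf.setNumReduceTasks(1);

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}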
I'm not sure whether you can keep the combiners, but I would start with the approaches above.
In the case you describe, you should use Reducers. As the key, use Context.getInputSplit().getPath() + Context.getInputSplit().getStart(): this combination is unique for each Mapper.
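A minimal sketch of such a mapper with the new API (the class name PerSplitKeyMapper and the Text/IntWritable types are only illustrative; it assumes a file-based input format so the split can be cast to FileSplit):

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

public class PerSplitKeyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private final Text splitKey = new Text();
    private final IntWritable one = new IntWritable(1);

    @Override
    protected void setup(Context context) {
        // getInputSplit() returns a generic InputSplit; with file-based input formats
        // it is a FileSplit, which exposes the file path and the split's byte offset.
        FileSplit split = (FileSplit) context.getInputSplit();
        splitKey.set(split.getPath().toString() + ":" + split.getStart());
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Every record in this split is emitted under the same split-unique key,
        // so one reduce group corresponds to one mapper's input.
        context.write(splitKey, one);
    }
}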