I'm running the following kind of pipeline:
digestA: hugefileB hugefileC
	cat $^ > $@
	rm $^

hugefileB:
	touch $@

hugefileC:
	touch $@
The targets hugefileB and hugefileC are very big and take a long time to compute (and need the power of Make). But once digestA has been created, there is no need to keep its dependencies, so the rule deletes them to free up disk space.

Now, if I invoke 'make' again, hugefileB and hugefileC will be rebuilt, even though digestA is already up to date.

Is there any way to tell 'make' to avoid recompiling the dependencies?
NOTE: I don't want to build the two dependencies inside the rules for 'digestA'.
The correct way is to not delete the files, as that removes the information that make uses to determine whether to rebuild the files. Recreating them as empty does not help, because make will then assume that the empty files are fully built.

If there is a way to merge digests, then you could create one digest from each of the huge files. The small digests are kept, and each huge file is automatically removed because it is an intermediate file.
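A minimal sketch of that idea, assuming sha1sum as the digest tool and concatenation as the merge step (both illustrative choices). Because the huge files have explicit rules here, they are marked intermediate with .INTERMEDIATE so make deletes them after use:

.INTERMEDIATE: hugefileB hugefileC

# digestA is built from small per-file digests, which are kept on disk.
digestA: hugefileB.sum hugefileC.sum
	cat $^ > $@

# Illustrative pattern rule: compute a digest for any file.
%.sum: %
	sha1sum $< > $@

hugefileB:
	touch $@

hugefileC:
	touch $@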
Use "intermediate files" feature of GNU Make:
So, adding the following line to the Makefile should be enough:
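.INTERMEDIATE: hugefileB hugefileC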
Invoking make for the first time:
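Roughly the following output (assuming the explicit rm $^ is dropped from the recipe; the final rm is printed by make itself as it removes the intermediate files):

$ make
touch hugefileB
touch hugefileC
cat hugefileB hugefileC > digestA
rm hugefileB hugefileC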
And the next time:
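$ make
make: 'digestA' is up to date.

(The exact wording and quoting of the message varies between make versions.)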
I would recommend that you create pseudo-cache files that are created by the hugefileB and hugefileC targets, then have digestA depend on those cache files, because you know they will not change again until you manually invoke the expensive targets. A sketch of this approach follows.
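One possible shape for this, using hypothetical stamp files hugefileB.stamp and hugefileC.stamp that the expensive recipes touch on success:

# The cheap stamp files persist, so digestA is not rebuilt on later runs.
digestA: hugefileB.stamp hugefileC.stamp
	cat hugefileB hugefileC > $@
	rm -f hugefileB hugefileC

hugefileB.stamp:
	touch hugefileB    # expensive computation goes here
	touch $@

hugefileC.stamp:
	touch hugefileC    # expensive computation goes here
	touch $@

To force the expensive work to run again, delete the stamp files by hand.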
If you mark hugefileB and hugefileC as intermediate files, you will get the behavior you want. For example:
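# .INTERMEDIATE marks the huge files so gmake deletes them automatically.
.INTERMEDIATE: hugefileB hugefileC

digestA: hugefileB hugefileC
	cat $^ > $@

hugefileB:
	touch $@

hugefileC:
	touch $@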
Note that you do not need the explicit rm $^ command anymore -- gmake automatically deletes intermediate files at the end of the build.