We have a project which uses gcc and makefiles. The project also consists of one big subproject (SDK) and a lot of relatively small subprojects which use that SDK and some shared framework.
We use precompiled headers, but that only makes re-compilation faster.
Are there any known techniques and tools to help with build-time optimization? Or maybe you know of some articles/resources about this or related topics?
If you have multiple computers available, gcc builds distribute well with distcc.
You can also use ccache on top of that.
All of this works with very few changes to the makefiles.
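A minimal sketch of how the two can be combined, assuming the host names, core counts, and job count are placeholders and that your makefiles honor $(CC)/$(CXX):

    # ccache checks its cache first; on a miss, CCACHE_PREFIX hands the
    # real compile off to distcc.
    export CC="ccache gcc"
    export CXX="ccache g++"
    export CCACHE_PREFIX="distcc"

    # Machines running distccd, with a job limit per host (hypothetical names).
    export DISTCC_HOSTS="localhost/4 buildbox1/8 buildbox2/8"

    # Run enough parallel jobs to keep the remote slots busy.
    make -j20 CC="$CC" CXX="$CXX"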
Using small files may not always be a good recommendation. A disk allocates space in clusters (often 32 or 64 KB minimum), with a file taking at least one cluster. So 1024 files of 3 KB each (small code inside) will actually take 32 or 64 MB on disk, instead of the expected 3 MB, and that 32/64 MB has to be read by the drive. If the files are scattered around the disk, seek time increases read time even more. The disk cache obviously helps with this, up to a limit. Precompiled headers can also be of good help in alleviating this.
So with due respect to coding guidelines, there is no point in going beyond them just to place each struct, typedef or utility class into a separate file.
You can use the distcc distributed compiler to reduce the build time if you have access to several machines. Here's an article from IBM developerWorks related to distcc and how you can use it: http://www.ibm.com/developerworks/linux/library/l-distcc.html
Another method to reduce build time is to use precompiled headers. Here's a starting point for gcc.
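A minimal sketch of the gcc side, assuming a common header named stdafx.h (the name is just an example):

    # Compile the heavy shared header once; gcc writes stdafx.h.gch.
    g++ -x c++-header -o stdafx.h.gch stdafx.h
    # Subsequent compiles that #include "stdafx.h" pick up the .gch
    # automatically, as long as the compile flags match.
    g++ -c -o foo.o foo.cpp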
Also, don't forget to use -j when building with make if your machine has more than one CPU/core (2x the number of cores/CPUs is just fine).
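For example, on systems with GNU coreutils you can compute that job count with nproc (a sketch, not part of the original makefiles):

    # 2x the number of online cores.
    make -j"$(( $(nproc) * 2 ))"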
http://ccache.samba.org/ speeds things up big time.
I work on a middle-sized project, and that's the only thing we do to speed up the compile time.
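If you don't want to touch the makefiles at all, one option is ccache's masquerade setup (the install paths below are examples):

    # Symlinks early in PATH make "gcc"/"g++" resolve to ccache,
    # which then runs the real compiler behind the scenes.
    ln -s "$(which ccache)" /usr/local/bin/gcc
    ln -s "$(which ccache)" /usr/local/bin/g++
    # Alternatively, if the makefiles honor $(CC)/$(CXX):
    make CC="ccache gcc" CXX="ccache g++"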