We have a number of small projects within our system running on Linux (Slackware 7-11, slowly migrating to RHEL 6.0). Around 50-100 applications and 15-20 libraries. Almost all our applications use one or more of our libraries. Our source tree looks something like this:
/app1
/app2
/app3
/include
/foo/app4
/foo/app5
/foo/app6
/foo/lib1
/foo/lib2
/lib/lib3
/lib/lib4
/lib/include
Now, I've done some work creating CMakeLists.txt files and have built most of the libs and some of the apps. I'm fairly comfortable using cmake to build. I did this with v2.6, and I recently (an hour ago) upgraded to 2.8. Each of the above projects has its own CMakeLists.txt file, specific to the project, to handle building and installation (no packaging, yet).
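To give an idea, each library's CMakeLists.txt looks roughly like this (a trimmed-down sketch; lib1 and the source/header names are placeholders, not our real layout):

    cmake_minimum_required(VERSION 2.8)
    project(lib1 C)

    # Placeholder layout: the real source lists are project-specific.
    include_directories(${CMAKE_CURRENT_SOURCE_DIR}/include)
    add_library(lib1 src/lib1.c)

    # Install rules so dependent apps can find the lib and its headers.
    install(TARGETS lib1
            ARCHIVE DESTINATION lib
            LIBRARY DESTINATION lib)
    install(FILES include/lib1.h DESTINATION include)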
I have a requirement to introduce and enforce continuous integration. I've installed and played around with Jenkins, and from what I've seen I'm very impressed. I'm also evaluating JIRA to do our issue tracking.
Just to get things up and going, I've done a cmake install on all the libs, so the apps can find them in the filesystem. Headers are installed to /usr/local/include and libs to /usr/local/lib. Is this a bad thing to do? Would it be better to tell cmake to look in the libs' source directories, to use the export interface, or to use the recently introduced ExternalProject_Add?
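For example, I understand the ExternalProject_Add route would look something like this in an app's CMakeLists.txt (a sketch only; the relative path to lib1 and the app sources are made up):

    include(ExternalProject)

    # Build and stage lib1 inside this app's build tree instead of
    # relying on a copy pre-installed under /usr/local.
    ExternalProject_Add(lib1_external
        SOURCE_DIR ${CMAKE_SOURCE_DIR}/../foo/lib1
        CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps)

    include_directories(${CMAKE_BINARY_DIR}/deps/include)
    link_directories(${CMAKE_BINARY_DIR}/deps/lib)

    add_executable(app4 src/main.c)
    add_dependencies(app4 lib1_external)
    target_link_libraries(app4 lib1)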
Because I'm going to be using Jenkins, I cannot be guaranteed that cmake can find the source or build directory. Of course, I can tell Jenkins to build the projects in order (or at least, build the dependencies first). If an update to a library breaks the building of another project, then I guess it'll be up to someone with 3/4 of a wit to determine this.
Thank you in advance
Just to get things up and going, I've done a cmake install on all the libs, so the apps can find them in the filesystem. Headers are installed to /usr/local/include and libs to /usr/local/lib. Is this a bad thing to do?
No, it is not a bad thing to do, but ideally your build should reproduce its resources from scratch. Portability and the fixing of build bugs become an issue when things need to be pre-installed on the system outside of the build process. If you are able to do it one of the other ways you mentioned, I would suggest that; but if it's going to make your build that much longer, it's a trade-off you need to feel out. My philosophy is that everything should be movable to a new Jenkins machine with a fresh install at the drop of a hat; this isn't always achievable, but it's something to strive for.
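One way to keep the system clean is to default the install prefix into the build tree, so a stray "make install" never touches /usr/local and Jenkins can redirect it anywhere it likes (a sketch; the "stage" directory name is arbitrary):

    # If the user (or Jenkins) did not pass -DCMAKE_INSTALL_PREFIX=...,
    # install into a staging area inside the build tree, not /usr/local.
    if(CMAKE_INSTALL_PREFIX_INITIALIZED_TO_DEFAULT)
        set(CMAKE_INSTALL_PREFIX "${CMAKE_BINARY_DIR}/stage"
            CACHE PATH "Install prefix" FORCE)
    endif()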
Because I'm going to be using Jenkins, I cannot be guaranteed that cmake can find the source or build directory. Of course, I can tell Jenkins to build the projects in order (or at least, build the dependencies first). If an update to a library breaks the building of another project, then I guess it'll be up to someone with 3/4 of a wit to determine this.
Well, one of the things I do with interdependent jobs is have the successful build of one job trigger the jobs that depend on it. So, for example, if B depends on A and A fails, B will never be run, and whoever caused the failure in build A is responsible for it, and so on. This prevents a cascading effect of broken builds that were all caused by a single broken dependency. I would suggest that you keep the files a build produces inside its own job folder and tell each dependent job the location of the files it requires. Again, keep your builds separate and clean.
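For example, the dependent job's CMakeLists.txt can take the upstream job's staging area as a cache variable that Jenkins passes on the command line (a sketch; LIB1_ROOT and the target names are hypothetical):

    # Jenkins passes something like -DLIB1_ROOT=/path/to/lib1-job/stage.
    set(LIB1_ROOT "" CACHE PATH "Staged install of lib1 from the upstream job")

    find_path(LIB1_INCLUDE_DIR lib1.h PATHS ${LIB1_ROOT}/include)
    find_library(LIB1_LIBRARY lib1 PATHS ${LIB1_ROOT}/lib)

    include_directories(${LIB1_INCLUDE_DIR})
    add_executable(app4 src/main.c)
    target_link_libraries(app4 ${LIB1_LIBRARY})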
I'm also evaluating JIRA to do our issue tracking.
I highly recommend JIRA as an issue-tracking system for a company; you might want to look at this Jenkins plugin for integration. If you're using Git, and you don't mind hosting your code off-site, I would give GitHub Issues a shot as well.
Good luck, you seem to be on the right track.