Maven 2 is driving me crazy during the experimentation / quick and dirty mock-up phase of development.

I have a pom.xml file that defines the dependencies for the web-app framework I want to use, and I can quickly generate starter projects from that file. However, sometimes I want to link to a 3rd party library that doesn't already have a pom.xml file defined, so rather than create the pom.xml for the 3rd party lib by hand, install it, and add the dependency to my pom.xml, I would just like to tell Maven: "In addition to my defined dependencies, include any jars that are in /lib too."

It seems like this ought to be simple, but if it is, I am missing something.
Any pointers on how to do this are greatly appreciated. Short of that, if there is a simple way to point Maven to a /lib directory and easily create a pom.xml with all the enclosed jars mapped to a single dependency, which I could then name / install and link to in one fell swoop, that would also suffice.
Using <scope>system</scope> is a terrible idea for reasons explained by others, installing the file manually to your local repository makes the build unreproducible, and using <url>file://${project.basedir}/repo</url> is not a good idea either because (1) that may not be a well-formed file URL (e.g. if the project is checked out in a directory with unusual characters), and (2) the result is unusable if this project's POM is used as a dependency of someone else's project.

Assuming you are unwilling to upload the artifact to a public repository, Simeon's suggestion of a helper module does the job. But there is an easier way now…
The Recommendation
Use the non-maven-jar-maven-plugin. It does exactly what you were asking for, with none of the drawbacks of the other approaches.
After a really long discussion with the CloudBees guys about proper Maven packaging of this kind of JAR, they came up with an interesting proposal for a solution:

Create a fake Maven project that attaches a pre-existing JAR as its primary artifact, by running an install:install-file execution from that project's POM. Here is an example of such a POM:
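A minimal sketch of what such a wrapper POM might look like, using an install:install-file execution bound to the install phase; the coordinates, plugin version and jar path are placeholders, not values from the original proposal:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>

    <!-- Placeholder coordinates for the wrapped third-party JAR -->
    <groupId>com.example.thirdparty</groupId>
    <artifactId>some-library</artifactId>
    <version>1.0</version>
    <packaging>pom</packaging>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-install-plugin</artifactId>
                <version>2.5.2</version>
                <executions>
                    <execution>
                        <id>install-wrapped-jar</id>
                        <phase>install</phase>
                        <goals>
                            <goal>install-file</goal>
                        </goals>
                        <configuration>
                            <!-- The pre-existing JAR shipped alongside this wrapper module -->
                            <file>${basedir}/lib/some-library-1.0.jar</file>
                            <groupId>com.example.thirdparty</groupId>
                            <artifactId>some-library</artifactId>
                            <version>1.0</version>
                            <packaging>jar</packaging>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
```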
But in order to implement it, the existing project structure has to be changed. First, keep in mind that a separate fake Maven project (module) has to be created for each such JAR, and a parent Maven project has to be created that includes all the sub-modules: all the JAR wrappers and the existing main project. The structure could be as sketched below.
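A sketch of such a layout, with placeholder module names:

```
parent-project/
├── pom.xml                       <- parent POM listing all modules below
├── some-library-wrapper/
│   ├── pom.xml                   <- fake project wrapping some-library-1.0.jar
│   └── lib/some-library-1.0.jar
├── another-library-wrapper/
│   └── ...
└── main-project/
    └── pom.xml                   <- the existing main project
```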
When the parent is built via mvn install or mvn package, all the sub-modules are built as well, which installs the wrapped JARs. That could be considered a minus here, since the project structure has to be changed, but it offers a non-static solution in the end.
Note: When using the System scope (as mentioned on this page), Maven needs absolute paths.
If your jars are under your project's root, you'll want to prefix your systemPath values with ${basedir}.
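For illustration, a system-scoped dependency with an absolute systemPath might look like this (the coordinates and jar name are placeholders):

```xml
<dependency>
    <groupId>com.example</groupId>
    <artifactId>some-lib</artifactId>
    <version>1.0</version>
    <scope>system</scope>
    <!-- ${basedir} makes the path absolute, relative to the project root -->
    <systemPath>${basedir}/lib/some-lib-1.0.jar</systemPath>
</dependency>
```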
I alluded to some Python code in a comment on @alex lehmann's answer, so I am posting it here.
Problems of popular approaches
Most of the answers you'll find around the internet will suggest that you either install the dependency to your local repository or specify a "system" scope in the pom and distribute the dependency with the source of your project. But both of these solutions are actually flawed.

Why you shouldn't apply the "Install to Local Repo" approach
When you install a dependency to your local repository, it remains there. Your distribution artifact will do fine as long as it has access to this repository. The problem is that in most cases this repository will reside on your local machine, so there will be no way to resolve this dependency on any other machine. Clearly, making your artifact depend on a specific machine is not a way to handle things. Otherwise this dependency will have to be locally installed on every machine working with that project, which is not any better.
Why you shouldn't apply the "System Scope" approach
The jars you depend on with the "system scope" approach neither get installed to any repository nor attached to your target packages. That's why your distribution package won't have a way to resolve that dependency when it is used. That, I believe, was the reason why the use of system scope even got deprecated. In any case, you don't want to rely on a deprecated feature.
The static in-project repository solution
After putting a repository declaration like the one sketched below in your pom, Maven will, for each artifact with a group id of the form x.y.z, also include a location inside your project directory in its search for artifacts (the exact path is shown after the sketch). To elaborate more on this, you can read this blog post.
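A sketch of such an in-project repository declaration, assuming the repository lives in a repo folder at the project root; the id and the release/snapshot policies are illustrative choices:

```xml
<repositories>
    <repository>
        <id>in-project-repo</id>
        <!-- A repository checked in alongside the project sources -->
        <url>file://${project.basedir}/repo</url>
        <releases>
            <enabled>true</enabled>
            <checksumPolicy>ignore</checksumPolicy>
        </releases>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
    </repository>
</repositories>
```

With the standard repository layout, an artifact with group id x.y.z, artifact id foo and version 1.0 would then be resolved from repo/x/y/z/foo/1.0/foo-1.0.jar inside the project directory.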
Use Maven to install to project repo
Instead of creating this structure by hand, I recommend using a Maven plugin to install your jars as artifacts. So, to install an artifact into an in-project repository under the repo folder, execute the command sketched below. If you choose this approach, you will also be able to simplify the repository declaration in the pom, as shown in the second sketch.
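A sketch of such an install command, with placeholder coordinates and jar name; the install-file goal of the maven-install-plugin accepts the localRepositoryPath and createChecksum parameters used here:

```bash
mvn install:install-file \
    -Dfile=some-lib-1.0.jar \
    -DgroupId=com.example \
    -DartifactId=some-lib \
    -Dversion=1.0 \
    -Dpackaging=jar \
    -DlocalRepositoryPath=repo \
    -DcreateChecksum=true
```

With the jars installed that way (layout and checksums generated by the plugin), the repository declaration in the pom can be reduced to something like:

```xml
<repositories>
    <repository>
        <id>in-project-repo</id>
        <url>file://${project.basedir}/repo</url>
    </repository>
</repositories>
```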
A helper script
Since executing the installation command for each lib is kind of annoying and definitely error-prone, I've created a utility script which automatically installs all the jars from a lib folder into the project repository, while automatically resolving all the metadata (groupId, artifactId, etc.) from the file names. The script also prints out the dependencies XML for you to copy-paste into your pom.
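For reference, such a copy-pasted entry is just an ordinary dependency declaration using the coordinates the jar was installed under (placeholders shown):

```xml
<dependency>
    <groupId>com.example</groupId>
    <artifactId>some-lib</artifactId>
    <version>1.0</version>
</dependency>
```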
Include the dependencies in your target package
When you have created your in-project repository, you'll have solved the problem of distributing the dependencies of the project with its source, but from then on your project's target artifact will depend on non-published jars, so when you install it to a repository it will have unresolvable dependencies.

To beat this problem, I suggest including these dependencies in your target package. You can do this with either the Assembly Plugin or, better, with the OneJar Plugin. The official documentation on OneJar is easy to grasp.
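As an illustration of the Assembly Plugin route, a jar-with-dependencies assembly can be produced with a configuration along these lines (the plugin version is just an example):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>3.6.0</version>
    <configuration>
        <descriptorRefs>
            <!-- Bundles the project's classes together with all runtime dependencies -->
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```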
I found another way to do this; see here, from a Heroku post.
To summarize (sorry about some copy & paste): create a repo directory under your root folder, put your third-party jars into it using the standard repository layout, and add that directory as a repository in your pom.xml. A sketch of these steps follows.
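A sketch of what that setup might look like, with placeholder coordinates and paths (the exact commands in the Heroku post may differ). First put the jar into the in-project repo directory using the standard repository layout:

```bash
mvn deploy:deploy-file \
    -Durl=file:///absolute/path/to/yourproject/repo \
    -Dfile=some-lib-1.0.jar \
    -DgroupId=com.example \
    -DartifactId=some-lib \
    -Dversion=1.0 \
    -Dpackaging=jar
```

Then declare that directory as a repository in pom.xml:

```xml
<repositories>
    <repository>
        <id>project.local</id>
        <name>project</name>
        <url>file://${project.basedir}/repo</url>
    </repository>
</repositories>
```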