Support caching Java dependencies for Maven projects #664
I see a different approach here, #663, that would work for Gradle projects too.
Hey @pierDipi, how well would you say #663 covers your use case? Is there anything that could be changed to make it work better for you (e.g. so it's not too much of a hassle to set up when fetching entire projects)? (Disclaimer: not a cachi2 maintainer, just trying to see if #663 could be improved upon.)
My only problem with it is that, as it stands, I don't know whether we have existing tooling, or will provide one, to create that custom "lock file". The project we're trying to build has around 230 dependencies (including transitive ones, and it's a relatively small/medium-sized project), so without a companion tool it becomes very tedious to create and maintain that lock file over time. At the same time, I wouldn't want every team to create their own bespoke tool.
The short answer is no, not currently, and most likely not in the future.
I definitely agree with this sentiment. While I can see how #663 could be used for your use case, I think it is not a 100% match. The feature, as I understand it, is more for fetching one-off artifacts from Maven, when fetching an entire build (or all its dependencies) is inefficient or costly. Again, I'm not a cachi2 maintainer, nor a Java expert in any way, but I could see your use case as a separate cachi2 package manager, if that's a typical way for a Java project to be structured and set up for a hermetic build.
For Java, there is no reliable way to capture all the dependencies necessary for a build other than running the build, so that would have to be the first step in prefetching. We could generate a lockfile from the contents of a Maven repository on disk, except that in some edge cases it will be challenging to determine the proper values of classifiers, versions, and types from the filenames.
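To make that ambiguity concrete, here is a minimal sketch (not existing cachi2 or Maven tooling; the repository path is just an assumption) that walks a local Maven repository and guesses coordinates from its layout. The directory structure yields groupId/artifactId/version fairly reliably; the classifier has to be guessed from whatever is left of the filename, which is the fragile part:

```bash
#!/usr/bin/env bash
# Hedged sketch, not cachi2 functionality: guess group:artifact:version[:classifier]
# from a local Maven repository layout such as
#   io/vertx/vertx-core/4.5.0/vertx-core-4.5.0.jar
#   io/vertx/vertx-core/4.5.0/vertx-core-4.5.0-sources.jar
# The directory path encodes groupId/artifactId/version; whatever remains in the
# *filename* after "<artifact>-<version>" is only a classifier guess, which is
# where types and unusual version strings make this unreliable.
REPO="${1:-$HOME/.m2/repository}"

find "$REPO" -name '*.jar' | while read -r jar; do
  dir=$(dirname "$jar")
  version=$(basename "$dir")
  artifact=$(basename "$(dirname "$dir")")
  group=$(dirname "$(dirname "$dir")")
  group=${group#"$REPO"/}
  group=${group//\//.}              # path separators -> groupId dots
  classifier=$(basename "$jar" .jar)
  classifier=${classifier#"$artifact-$version"}
  classifier=${classifier#-}        # strip the leading dash, if any
  echo "$group:$artifact:$version${classifier:+:$classifier}"
done
```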
I may be missing something, or the logic is flawed in some cases, but in the POC here, openshift-knative/eventing-kafka-broker#1273, I don't think the actual build is running when caching dependencies, since there is no project code to build at that stage. I had to add all the modules to the Maven reactor via […]. At a high level, what I did was: ensure […]
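For illustration, a minimal sketch of the POM-only staging that approach implies, assuming the idea is to copy only the `pom.xml` files (no sources) into a scratch directory while preserving the module layout, so the whole reactor is present when resolving dependencies. The paths are placeholders, not the exact POC setup:

```bash
# Hedged sketch: stage only the pom.xml files so that a later
# `mvn dependency:go-offline` run can resolve the full multi-module
# reactor even though no project sources are present yet.
SRC=.            # illustrative: root of the project checkout
DEST=/tmp/poms   # illustrative: scratch directory for the POM-only layout

find "$SRC" -name pom.xml -not -path '*/target/*' | while read -r pom; do
  mkdir -p "$DEST/$(dirname "$pom")"
  cp "$pom" "$DEST/$pom"
done
```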
Do you create a POM-only module layout without adding the source code and run […]?
Actually, that's wrong, […]
Yes, […]
Hi guys, thank you for opening this issue and for the ongoing discussion. We don't plan to implement Java support in Cachi2, but of course you're more than welcome to try to contribute it to our project. We would much appreciate it, and we'll try to support you with reviews.
I'm trying to have hermetic builds for a Java project using Maven in Konflux. However, since cachi2 doesn't support caching Maven dependencies, the only reasonable way I found was to create an intermediate image that downloads the dependencies [1], but that doesn't pass the default Enterprise Contract, since using intermediate images causes the error:
Base image "xyz" is from a disallowed registry.
Here is the PR with a proof of concept using the intermediate image: openshift-knative/eventing-kafka-broker#1273
[1] https://github.com/openshift-knative/eventing-kafka-broker/blob/e1355b833093404b5e5e13f5a7bba1fc241cf49c/openshift/ci-operator/static-images/dispatcher/konflux/Dockerfile.deps
Potential Solution
To cache dependencies in Maven, in a way that also supports multiple modules, we need to:

1. Place all of the project's `pom.xml` files in a temporary directory, respecting the file system structure
2. Run `mvn package dependency:go-offline <user provided flags> -Dmaven.repo.local=<cachi2-maven-deps-directory>`
3. Remove the `_remote.repositories` files: `find <cachi2-maven-deps-directory> -path "*_remote.repositories" | xargs -I{} rm {}`
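Steps 2 and 3 combined into one hedged sketch (step 1 is the POM-staging loop sketched earlier in the thread); the directory names and the `<user provided flags>` placeholder are illustrative, not an agreed-upon cachi2 interface:

```bash
#!/usr/bin/env bash
set -euo pipefail
# Hedged sketch of the resolution and cleanup steps; paths are placeholders.
POM_DIR="${1:?directory containing the staged pom.xml layout}"
DEPS_DIR="${2:-$PWD/cachi2-maven-deps}"   # illustrative output directory

# Resolve every dependency of the multi-module reactor into a dedicated
# local repository (any <user provided flags> would be appended here).
(
  cd "$POM_DIR"
  mvn package dependency:go-offline -Dmaven.repo.local="$DEPS_DIR"
)

# _remote.repositories files record which remote repository each artifact was
# resolved from; removing them avoids Maven re-checking the origin when the
# prefetched repository is reused for an offline build.
find "$DEPS_DIR" -path '*_remote.repositories' -exec rm {} +
```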