[VOTE] Maven incremental build for BIG-sized projects with local and remote caching


[VOTE] Maven incremental build for BIG-sized projects with local and remote caching

Maximilian Novikov

Hi All,

 

We want to contribute an upstream change to Maven to support a true incremental build for big projects.

To raise a pull request we have to pass a long chain of Deutsche Bank’s internal procedures. So, before starting the process, we would like to get your feedback on this feature.

 

Motivation:

 

Our project is hosted in a mono-repo and contains ~600 modules. All modules share the same SNAPSHOT version.

There is a lot of test automation around this; everything is tested before merging into the release branch.

The current setup simplifies build/release/dependency management for the 10+ teams that contribute to the codebase. We can release everything in one click.

The major drawback of this approach is build time: a full local build takes 45-60 min (-T8), a CI build ~25 min (-T16).

 

To speed up our build we needed two features: an incremental build and a shared cache.

Initially we considered migrating to Gradle or Bazel. As the migration costs for those tools were too high, we decided to add similar functionality to Maven.

Current results: 1-2 min for a local build (-T8) if the build was already cached by CI, and ~5 min for a CI build (-T16).

 

Feature description:

 

The idea is to calculate a checksum of the inputs and save the outputs in a cache.

Each module's checksum is calculated from:

* Effective POM hash
* Sources hash
* Dependencies hash (dependencies within the multi-module project)

Project source inputs are searched for inside the project, plus all paths taken from the plugins' configuration.
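
To illustrate the idea, here is a minimal sketch of how such a module checksum could be computed. The class and method names and the choice of SHA-256 are assumptions made for this example, not the actual implementation:

import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.List;

final class ModuleChecksum {

    // Combines the effective POM, the source files and the checksums of the
    // upstream reactor dependencies into a single digest for one module.
    static byte[] compute(byte[] effectivePom,
                          List<Path> sourceFiles,
                          List<byte[]> dependencyChecksums) throws Exception {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        digest.update(effectivePom);                  // effective POM hash
        for (Path source : sourceFiles) {             // sources hash
            digest.update(Files.readAllBytes(source));
        }
        for (byte[] dep : dependencyChecksums) {      // dependencies hash (reactor modules only)
            digest.update(dep);
        }
        return digest.digest();
    }
}

If the resulting checksum matches an entry in the cache, the stored outputs are restored instead of running the plugins for that module.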

How it works in practice:

1. CI: runs builds and stores the outputs in the shared cache.
2. CI: reuses outputs for the same inputs, so build time decreases.
3. Locally: when I check out a branch and run 'install' for the whole project, I get all up-to-date snapshots for this branch from the remote cache.
4. Locally: if I change multiple modules in the tree, only the changed subtree is rebuilt.

 

The impact on the current Maven codebase is very localized (MojoExecutor, where we injected a cache controller).

Caching can be activated/deactivated by a property, so the current Maven flow works as is.

And the big plus is that you don't need to rework your current project. Caching should work out of the box; you just need to add a config file in the .mvn folder.
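
For illustration only, the config could look roughly like the sketch below; the file name, element names and the enabling property are hypothetical and only show the intent:

<!-- Hypothetical .mvn/cache-config.xml - all names here are illustrative -->
<cache enabled="true">
  <!-- shared cache populated by CI builds -->
  <remote url="https://ci.example.com/maven-build-cache"/>
  <!-- per-developer cache on local disk -->
  <local location="${user.home}/.m2/build-cache"/>
  <input>
    <!-- extra paths to hash, in addition to those discovered from plugin configuration -->
    <include>src/main/webapp</include>
  </input>
</cache>

Caching could then be switched off for a single run with something like -Dcache.enabled=false (again, a property name made up for this sketch).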

 

Please let us know what you think. We are ready to invest in this feature and to address any further feedback.

 

Kind regards,

Max

 




Re: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

Tibor Digana
In theory, the incremental compiler would make it faster.
But this can be claimed only if you present a demo project which has trivial
tests taking much less time to complete than the compiler.

In reality, the tests in huge projects take significantly longer than
the compiler.
Some developers say "switch off all the tests" in the release phase, but
that's wrong because then the quality goes down and methodologies are
broken.

I can see a big problem in that we do not have an interface between the Surefire
and Compiler plugins negotiating which tests have been affected by modifications,
including modules and classes in the entire structure.

Having an incremental compiler is easy: just use compiler:3.8.1 or the
Takari compiler.
But IMO the biggest performance benefit would come from having a truly
incremental test executor.
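
For reference, a minimal sketch of what the stock per-module incremental compilation looks like in a POM; note that useIncrementalCompilation already defaults to true and its exact semantics differ between plugin versions, so this is only illustrative:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.1</version>
  <configuration>
    <!-- shown explicitly for illustration; this is the default value -->
    <useIncrementalCompilation>true</useIncrementalCompilation>
  </configuration>
</plugin>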


Re: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

Alexander Ashitkin
Indeed, we have a kind of option 2, with variations. The current implementation is an opt-in feature driven by configuration, with some metadata describing the required cache behavior and hints.

A Maven extension is an option, but we would love to have this in Maven itself, which I believe is in the interest of the Maven community. An extension is a route we are trying to avoid, and we are not even sure it could be implemented as an extension, as it requires changes in Maven core.

Thanks in advance, Aleks

On 2019/09/13 21:37:15, Romain Manni-Bucau <[hidden email]> wrote:

> There are multiple possible kinds of incremental support:
>
> 1. Scm related: do a status and rebuild downstream reactor
> 2. Full and module build graph: seems it is the one you target, ie bypass
> modules without change. Note that it only works if upstream graph is taken
> into account.
> 3. Full build: each mojo has incremental support so the full build gets it.
> The issue is that it requires each mojo to know if it needs to be executed, or
> to give enough info to the mojo executor to decide (Gradle requires all
> inputs/outputs to assume this state - which is still just a heuristic and
> not 100% reliable).
>
> In the current state, 2. sounds like a good option, since 3 can require a lot
> of work for external plugins (today's builds use far more plugins not provided
> by Maven than core plugins).
> Now, we should be able to activate it or not so having a cacheLocation
> config in settings.xml can be good.
>
> Side notes:
>
> 1. having it on by default will break builds - the reactor is deterministic and
> bypassing a module can break a build since it can init Maven properties - for
> example - for the next modules
> 2. You can't find all in/out paths from the POM in general, so your algo is
> not generic; a meta config may be needed in .mvn
> 3. We should let a mojo be able to disable that and replace the default logic
> (Surefire is a good example where it must be refined, and it can save hours
> there ;))
> 4. Let's try to implement it as a Maven extension first, then if it works well on
> multiple big projects, get it into core?
>
> Romain
>
>
>



Re: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

ljnelson
In reply to this post by Maximilian Novikov
On Fri, Sep 13, 2019 at 11:01 PM Alexander Ashitkin <
[hidden email]> wrote:

> > This feature is a true incremental build – you don’t build modules which
> > were not changed at all, and build only modified/changed ones.
>

Suppose module B depends on module A and I change A.  Does B get rebuilt in
your system?

Best,
Laird

Re: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

Tibor Digana
In reply to this post by Tibor Digana
Oh yeah, exactly the opposite.
Jenkins has several ways to create a Maven build configuration, and it knows
where the repo and workspace are, it knows where to store the archive, and it
knows when the build failed.
We cannot take that responsibility, because the build may fail for whatever
reason and we do not know whether to keep the folders, delete all
"/target" folders, or delete only the failed one. The user knows that.
We cannot archive the folders, because we may cause very high
disk usage outside the control of CI. And we cannot take
responsibility for the lifetime of these archives. That is all the property of
Jenkins, and Jenkins has the features and management plugins where the
workspace may be retained for a certain period of time and archives are limited in
some way. The archives can be stored in another folder, and we should not
adopt these responsibilities, because then we would suddenly end up with all the
knowledge of the distributed system, and then we as the Maven project would end
up as an unmaintainable project with many more issues in Jira and requirements
for which we would not be able to find the spare time to develop.

On Sat, Sep 14, 2019 at 1:25 PM Romain Manni-Bucau <[hidden email]>
wrote:

> Tibor, maven is the only one with the logic to give any cache the data it
> needs. Jenkins alone can't since it does not own the reactor nor plugin I/O
> values.
>
> Le sam. 14 sept. 2019 à 12:45, Tibor Digana <[hidden email]> a
> écrit :
>
> > But I do not understand why Maven should be responsible for the project
> > cache control/management of "/target" directories.
> > It is a responsibility of the build manager which is the Jenkins.
> > The Jenkins has the ability to archive files and such property already
> > exists in the Jenkins.
> >
> > So the Jenkins has a full knowledge about:
> >
> > 1. how long the workspace content retains intact
> > 2. what commit hash is for the last build/job/branch
> > 3. and what commit was successful
> >
> > If the target directories remain intact (or are renewed from the archive) in the
> > workspace for a very long time, and the workspace was reused by the next build,
> > then I would say that the improvement should work as it is at the CI level.
> >
> > Maybe what is necessary is only an improvement in Maven where we would
> > obtain the list of modules or directories changed in the current commit.
> > Then Maven can highly optimize its own build steps and build only those
> > modules which have been changed, plus their dependent modules.
> > So an interface between CI and Maven is needed, in a kind of extension, or
> > the class MavenCli can be extended with some new entrypoint.
> >
> > But I do not think that Maven has to take care of the responsibilities of CI
> > (project cache mgmt); that's not our task, I would say, and we as Maven would
> > never know all about the miscellaneous CI specifics and therefore would
> > not cope with CI-related troubles.
> >
> > Cheers
> > Tibor17
> >
> >
> >
> > On Sat, Sep 14, 2019 at 11:08 AM Robert Scholte <[hidden email]>
> > wrote:
> >
> > > Did anyone Google for "maven extension build cache"? There are already
> > > commercial solutions for it.
> > > Even though I would like to see improvements in this area, the old
> > > architecture of Maven makes it quite hard to move to that situation. First
> > > of all, it requires changes to the Plugin API (without breaking backwards
> > > compatibility) to have support out of the box.
> > >
> > > Robert

Re: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

rfscholte
Tibor, it seems like you're missing the bigger picture.
The question is similar to what we've discussed in the past: can we define
whether Surefire should be executed or not?

We should define incremental builds as "should a goal be executed or
not?", e.g. based on the results of the previous build.
First of all: calling 'clean' makes it impossible to do incremental builds.
Next, it is the *plugin developer* that knows best whether the goal should be
executed or not. Right now that is still logic inside the plugin, but if the
plugin API understood input and output, we could leave it up to Maven to
decide whether a goal should be executed.
The build plan now gives us a graph of Maven projects, but theoretically,
with such changes, we could make a graph of goals. And it could detect
useless invocations of goals, because their output is never used.
Some might recognize a Gradle concept here, and that's correct. At this
point they were able to design something that works better compared to
Maven. For their build cache extension they had to analyze the plugin
descriptors, marking all parameters as either input or output. And that
boosts the builds with their extension.
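
To make that concrete, here is a rough sketch of what input/output-aware Mojo parameters could look like. The @Input/@Output annotations below do not exist in the current Maven plugin API; they are purely hypothetical and only illustrate the kind of metadata Maven would need:

import java.io.File;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.List;

// Hypothetical marker annotations - not part of today's plugin API.
@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.FIELD) @interface Input {}
@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.FIELD) @interface Output {}

public class CompileMojo {

    // Declared input: Maven could hash these paths and skip the goal
    // when the hash matches the previous build.
    @Input
    List<String> compileSourceRoots;

    // Declared output: goals consuming this directory form a goal graph,
    // and a goal whose output is never consumed could be skipped entirely.
    @Output
    File outputDirectory;
}

With such metadata the executor, rather than each individual plugin, could decide whether a goal needs to run at all.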

thanks,
Robert





Re: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

Alexander Ashitkin
In reply to this post by ljnelson
Hi Laird
Yes, in this case B will be rebuilt - that's one of the basic requirements.

Thanks in advance




RE: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

Maximilian Novikov
In reply to this post by ljnelson
Tibor,

We understand your position.

We moved from separate SCM repositories to a single SCM. We could move back, but we don't want that.

In a single SCM we like:
1. Atomic commits
2. A single point of responsibility.
If someone makes an incompatible change in a shared library, he is responsible for updating all usages. At first look this can be seen as slowing development down, but it helps us avoid growing technical debt. We never get into the situation where projects A, B, C, D... depend on different versions of a shared library and we need to make a major upgrade, which can block the release of some apps, etc.

Now we release 20+ client apps and 50+ backend components every week or even more often. With multiple SCM repositories we would need to hire a team of release managers and build engineers to coordinate and support this.

Again, we are not selling our approach. We implemented the feature that was missing for us.

PS. Just think about why commercial products like the Gradle Maven extensions appeared.


From: Tibor Digana <[hidden email]>
Date: Saturday, 14 Sep 2019, 9:43 PM
To: Maven Developers List <[hidden email]>
Subject: Re: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

Alexander,
Enrico is really right. Today it is Microservices and there every
microservice is in a separate SCM repo.

It was just an example with microservices, but in my experience you
can always find the lower-bound modules in the hierarchy which do not change
so much and segregate them into other SCM repos. Those should undergo the
release process, share release versions and avoid sharing SNAPSHOT
versions.

You can find the top roots which are actually applications. If you have 10
WAR files as a result of the build and all of them should be deployed, then
there is a strong reason to separate them in separate SCM repos.

Then this separation concept will guide you to isolate the middle layers
into isolated services as JAR files. And then you end up with microservices and
SOA services instead of JAR files, or you will be much closer to them. The huge
monolith project is gone.

All the development process will be faster and more flexible than it was
before. Just try!

Cheers
Tibor17

On Sat, Sep 14, 2019 at 5:23 PM Alexander Ashitkin <[hidden email]>
wrote:

> Hi Enrico,
> Thanks for the feedback. That's a side discussion about the best approach for
> project layouts. Monorepo has its own advocates, and it is easy to find
> posts describing why Google, Microsoft or Facebook went monorepo.
> Contrary to that way of thinking, we want to be ready to go globally in an emergency
> scenario. If, say, a zero-day vulnerability is discovered in one of the low-level,
> widely reused core libraries, we need just one click to build/test/deploy
> and safely go live globally, with the whole estate updated on a scale of thousands
> of processes. And you know, there are people in the world who think that
> a codebase scattered across small repos is difficult to maintain and
> that snapshots are evil. It all depends.
> Honestly, I think it is a kind of reversed approach when your
> build system defines how your software development processes work. Google
> has its own vision and just implemented Bazel, and that is the correct approach. Btw,
> Bazel is perfect for such a scenario, but costly to migrate to for an existing
> project.
>
> So if you choose a monorepo, as we did, it is normal to work on just a part of
> the project. You just need a way to deal with the scalability challenges:
> a) you hit hardware and infrastructure limitations and need to address
> them in some way;
> b) you need an incremental build so you can work on a subpart of the project
> but contribute to the shared codebase.
>
> Sincerely yours, Aleks
>
> On 2019/09/14 08:41:37, Enrico Olivelli <[hidden email]> wrote:
> > > I feel that in general having a huge monolithic project is kind of a
> > > project smell.
> > > Btw, I have some big projects with 100+ modules, so I can see your pain.
> > > In day-to-day work a single developer doesn't work on all of the
> > > modules; usually you touch 1-2 modules and maybe some integration/system
> > > tests.
> > > If you need to rebuild the full project for every change, maybe there is
> > > something wrong with the overall design.
> > >
> > > I think you have your motivations for your layout, so let's talk about your
> > > proposal.
> > >
> > > If you have a way to split your project into subsystems, you can use some
> > > shared remote repository for deploying snapshots in order to share
> > > intermediate results with other developers.
> > >
> > > If your goal is to be ready for releases, I don't get your point. Usually
> > > you work with snapshots, and for a release you have to rebuild the full
> > > codebase one time, and only once, in order to ensure that you have a consistent
> > > build of the project.
> > > With all of these kinds of temporary caches, how do you ensure that the final
> > > artifacts are the intended ones?
> > >
> > >
> > > Side note: this is not a real VOTE thread.
> > >
> > > Just my 2 cents.
> > >
> > > Don't get me wrong, I admire your will to improve the Maven ecosystem with this
> > > cool feature! Thank you for contributing your work. We will try to get the
> > > best out of it.
> >
> > Enrico
> >



Re: [VOTE] Maven incremental build for BIG-sized projects with local and remote caching

Tibor Digana
Hello Maximilian,

So now the next step is to break the traditional dependencies in Maven and
isolate the services via web services, e.g. JAX-RS or JAX-WS, and you would
not "touch" the POMs.
You would need to use Logstash, Kibana, Elasticsearch, and Zipkin, because the
logs won't be aggregated without these frameworks.
This would require you to spend some time developing automatic deployment
and reliable CI.

The monolith would remain at the infrastructure level but not at the code level.
There you can write integration tests in every service. The input XML/JSON
received from another service can be a mock with mock data. The service and
its project, as well as the tests, become isolated at the project level.
The tests would become documentation, and the data (XML/JSON) would be a
specification for another team.
In this setup a particular piece of functionality would end up in the right
place. Shared data won't become a workaround anymore; sharing something can
easily happen in a monolith project.

The worst situation is if you share the database between the services,
because then you really have to deploy many services.
One way is, for instance, an architecture where you have one NoSQL database
per webapp, and an RDBMS as master data.
Each webapp has its own NoSQL database.
Then the services would read only from their NoSQL database and write to the RDBMS master
data, with JMS streaming the data back to the NoSQL databases via a data/event bus.

It is more about infrastructure and that kind of isolation.
Since every app has an isolated database, not all services have to change
just because a new feature required a database migration to new tables and
relations.
The probability of a change in a service would be smaller.

Then you have got DDD and CQRS, but not Event Sourcing - only partially.

Cheers
Tibor17

