https://blog.gradle.org/preview-avoiding-task-configuration-time
This issue is intended to collect community feedback about the new Task API that will replace the existing way tasks are created and configured in Gradle build scripts and plugins.
In particular, we're looking for:
If you believe that you have encountered a bug, feel free to link to it here, but we would prefer to track these as separate issues, since we will prioritize fixing buggy behavior over new features.
In terms of documentation, I was checking whether a task already existed using findByName("taskName") == null
; the doc says there's no equivalent to findByName
, which is true, but for this specific case I could just use tasks.getNames().contains("taskName")
instead. getNames()
might be worth mentioning in the docs.
See https://github.com/tbroyer/gradle-apt-plugin/commit/432509ec85d1ab49296d4f9b21fad876523c6a8a#diff-80490463545f2ead7e8427b5597a7a0bR192 for that specific example.
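For anyone following along, the check can be sketched like this in the Kotlin DSL (the task name is illustrative); `tasks.names` returns the set of registered names without realizing any tasks:

```kotlin
// Sketch (Gradle Kotlin DSL); "someTask" is just an illustrative name.
// tasks.names queries registered task names without realizing any task.
if ("someTask" !in tasks.names) {
    tasks.register("someTask") {
        // configuration here runs only if the task is actually needed
    }
}
```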
Thanks @tbroyer.
In your example, how would eclipseJdtApt
or eclipseFactorypath
be created before that plugin? Would it be from another plugin?
Ha, good point! Honestly, this is a mix of cargo-cult and (probably overly) defensive programming, in case of, I don't know, the plugin being applied twice or things like that. If that's not needed (and it might be the case), let me know and I'll consider removing those checks.
I've had a go this evening at migrating my Refaster plugin to the new Task API (see branch https://github.com/jbduncan/jupiter-collection-testers/tree/migrate-task-configuration-avoidance), but when I run ./gradlew help -Dorg.gradle.internal.tasks.stats --scan
, the build scan shows that all tasks my plugin produces are still eagerly created during configuration.
My knowledge of Gradle plugin development is still very much at the beginner level, so I've no idea how to diagnose the problem and figure out if I'm just programming things poorly (very likely!) or if the existing APIs aren't good enough.
Any thoughts would be appreciated!
I should have added that this Refaster plugin I talk about is a buildSrc
plugin and is located under https://github.com/jbduncan/jupiter-collection-testers/tree/migrate-task-configuration-avoidance/buildSrc. :)
Hi @jbduncan, your problem is here:
This construct forces eager creation of tasks, and (IIUC) the tasks added by your plugin are JavaCompile
tasks. You need to change that to:
tasks.withType<JavaCompile>().configureEach {
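For contrast, a minimal sketch of the eager form versus the lazy form (assuming the Kotlin DSL and the java plugin; the encoding setting is just an example):

```kotlin
// Eager: every JavaCompile task is created and configured immediately.
tasks.withType<JavaCompile> {
    options.encoding = "UTF-8"
}

// Lazy: the same configuration runs only for tasks that get realized.
tasks.withType<JavaCompile>().configureEach {
    options.encoding = "UTF-8"
}
```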
Thanks @ldaley! That partially fixes my problem.
I diagnosed things further, and I think I've managed to figure out that https://github.com/tbroyer/gradle-errorprone-plugin is forcing eager construction of the JavaCompile
tasks that my plugin lazily registers (see https://github.com/tbroyer/gradle-errorprone-plugin/issues/73).
So I think that there's nothing else that can be done until either I'm proven wrong or gradle-errorprone-plugin is fixed.
Thanks again!
@tbroyer Plugins normally can't be applied twice to the same project and I think you can assume that's not going to happen.
My main hesitation to suggesting a replacement for findByName
is that it makes the plugin very sensitive to ordering. It's a bit of a smell if reversing the order of plugin application completely changes the behavior of a build/plugin.
Thanks @big-guy, removed the checks and now unconditionally create/register the tasks (https://github.com/tbroyer/gradle-apt-plugin/commit/3146246?w=1)
One thing I have been experimenting with using the Property
/Provider
APIs was to be able to attach an output to a domain object. Then, consumers can use that domain object if they want to use the outputs without needing to know what tasks build that output.
Here is a contrived, pseudo-example:
// imagine a plugin that does something like the following
data class MyThing(
private val myName: String,
val myBinary: Provider<RegularFile>, // a thing that gets produced by the tasks
val binDir: Provider<Directory> // another output destination
) : Named {
override fun getName(): String = myName
}
val myThings = project.container(MyThing::class.java) { name ->
val t1: MyTask1 = tasks.create(...)
val t2: MyTask2 = tasks.create(...)
MyThing(name, t2.someFileOutput, t1.binDir)
}
val extensionType = object : TypeOf<NamedDomainObjectContainer<MyThing>>() {}
extensions.add(extensionType, "myThings", myThings)
// some user land code
val myNamedThing by myThings.creating
// presuming support of something like this in the future
tasks.register("userlandTask", Exec::class.java) {
executable(myNamedThing.myBinary)
...
}
This seems to work out alright today, although I don't know if it's the best approach.
Now if I try and change the factory creation to instead use register
it becomes extremely difficult to maintain the API of NamedDomainObjectFactory
:
val myThings = project.container(MyThing::class.java) { name ->
val t1: TaskProvider<MyTask1> = tasks.register(...)
val t2: TaskProvider<MyTask2> = tasks.register(...)
// MyThing(name, t2.someFileOutput, t1.binDir) // ERROR! wrong types
MyThing(
name,
t2.map(MyTask2::someFileOutput).get(),
t1.map(MyTask1::binDir).get()
) // this forces configuration of the tasks
}
That is just some of my thoughts around how I have been experimenting with the different APIs. We have some cases where we do whenObjectAdded {}
/all {}
configurations where we create tasks for each domain object, but we also want to be able to refer to the outputs throughout the task chain by only using the domain object instead of looking up the task that produces that output. These may not be the intended uses, but I'm still looking for the appropriate way to model something like this, especially when there are a lot of tasks that are used to create those outputs.
@mkobit Regarding your (albeit contrived) example, couldn't you .map(Provider::get)
to turn the Provider<Provider<RegularFile>>
into a Provider<RegularFile>
? (rather than .get()
like you did here)
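To illustrate the difference, a sketch using the contrived MyTask2 type from the example above (in Gradle 5.0+ there is also Provider.flatMap, which flattens the nested provider while keeping everything lazy):

```kotlin
val t2: TaskProvider<MyTask2> = tasks.register("t2", MyTask2::class.java)

// map over the TaskProvider yields a nested Provider<Provider<RegularFile>>;
// unwrapping the inner provider inside map keeps the unwrapping deferred
// until the outer provider is queried:
val viaMap: Provider<RegularFile> = t2.map { it.someFileOutput.get() }

// flatMap (Gradle 5.0+) flattens directly, without realizing the task:
val viaFlatMap: Provider<RegularFile> = t2.flatMap { it.someFileOutput }
```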
@tbroyer I think that works in some cases, but possibly not others. The main ones that come to mind are things like DirectoryProperty
or RegularFileProperty
. For example, if I want to have a configurable output destination, but also be able to refer to it from tasks.
If we change the previous example a little bit:
// imagine a plugin that does something like the following
data class MyThing(
private val myName: String,
val myBinary: RegularFileProperty, // configurable file output destination
val binDir: DirectoryProperty // configurable directory output
) : Named {
override fun getName(): String = myName
}
val myThings = project.container(MyThing::class.java) { name ->
val t1: MyTask1 = tasks.create(...)
val t2: MyTask2 = tasks.create(...)
MyThing(name, t2.someFileOutput, t1.binDir)
}
// ...
// some user land code
val myNamedThing by myThings.creating {
myBinary.set(...)
binDir.set(...)
}
// presuming support of something like this in the future
tasks.register("userlandTask", Exec::class.java) {
executable(myNamedThing.myBinary)
...
}
In the tasks.create
case the task's RegularFileProperty
/DirectoryProperty
can be passed right into the constructor and can be used in user code, and the inferred task dependencies and whatnot _should_ work (I think). In the tasks.register
case, the type that is given back is TaskProvider<MyTask1>
, and map
ping it would produce something like Provider<DirectoryProperty>
. Calling Provider.get
on that gives you the expected DirectoryProperty
, but I believe it "realizes" the task. Calling .map { it.get() }
gives back a Provider<Directory>
instead.
I don't know how realistic this case is and it might be splitting hairs. I'm sure there are kinds of workarounds that can be applied to it, but I thought it might be helpful to type some of this up.
I may be overthinking this (or not thinking enough), but I think your suggestion makes sense for some prototyping.
Hi.
Can one suggest how to convert this code:
sourceJarTask = tasks.register("sourceJar", Jar) {
....
}
publishing {
publications {
mavenJava(MavenPublication) {
...
artifact sourceJar {
classifier "sources"
extension "jar"
}
}
}
}
The problem is that artifact 'task-name' causes the task to be created immediately.
Tried to provide a TaskProvider to 'artifact' with no luck.
Thanks
Boaz
@boaznahum Not all plugins have been migrated to use lazy tasks yet (as of Gradle 4.9). I sent a PR for the java-gradle-plugin
(#6115), and maven-publish
also still uses the old API, creating tasks eagerly (I started converting it, but have difficulty making the unit tests pass; lots of mocking and stubbing and relying on internals / implementation details)
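As a point of reference, once maven-publish caught up in later Gradle releases, a TaskProvider could be passed to artifact directly. A hedged sketch (assumes the java and maven-publish plugins are applied, and a Gradle version where artifact accepts a TaskProvider; archiveClassifier is the 5.1+ name, older versions use classifier):

```kotlin
val sourceJar = tasks.register("sourceJar", Jar::class.java) {
    archiveClassifier.set("sources")
    from(sourceSets["main"].allSource)
}

publishing {
    publications {
        create("mavenJava", MavenPublication::class.java) {
            from(components["java"])
            artifact(sourceJar) // the Jar task is realized only when needed
        }
    }
}
```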
@tbroyer Thanks. Meanwhile, is there a trick to make the publication lazy?
(We have a serious configuration-time problem if we have to wait long for the new API beyond 4.9.)
@boaznahum We haven't made it to fixing up publishing yet, so there's not an easy/public way to avoid the jar task being created.
You could try creating a build scan with your build (either with our public instance or running your own):
https://docs.gradle.com/enterprise/get-started/#analyze_build_performance
This will help you narrow down where your configuration time is being spent.
Is it legal to add a doLast inside the configuration block? Like so:
tasks.register("clean", Delete) {
delete 'aFolderPath'
doLast{
println "Legal?"
}
}
When I do so, the task always executes; when I remove the doLast it works properly.
I tried with the Copy task and the up-to-date mechanism works properly.
@jgafner Yes, you can configure a task like normal.
What you're seeing happens with both the old way of creating tasks (tasks.create(...)
) and the new way (tasks.register(...)
).
For Delete
tasks, there are no known inputs/outputs to the task, so Gradle relies on the task reporting if it did work or not. The main action for Delete
is to delete files/paths. If everything has been deleted, the task action reports "nothing to do" back to Gradle. Each action (doLast
or doFirst
) you add to the task resets that status. So since doLast { println "Legal?" }
doesn't report "nothing to do" again, Gradle counts the task as having done something.
For other types of tasks (like Copy
), Gradle can keep track of the inputs/outputs of the task. In that case, the task may be up-to-date when it has extra actions (doLast
or doFirst
) because the inputs and outputs of the task have not changed.
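A small sketch of both behaviors described above (paths and task names are illustrative):

```kotlin
// Delete declares no outputs; its action reports "did no work" when there
// is nothing to delete. An extra action resets that, so the task always
// shows as executed:
tasks.register("cleanTmp", Delete::class.java) {
    delete("build/tmp-example")
    doLast { println("extra action makes the task report work done") }
}

// Copy tracks inputs and outputs, so it can be UP-TO-DATE even with extra
// actions, because the up-to-date check happens before any action runs:
tasks.register("copyDocs", Copy::class.java) {
    from("src/docs")
    into("build/docs")
    doLast { println("only printed when the copy actually runs") }
}
```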
Is there a way to filter the lazy collection? tasks.withType<JavaCompile>().filter { ... }.configureEach { }
? You could most likely just do it in the configureEach
block with an if
block, but it seems like a filter { }
type method might make sense.
@mkobit What would be passed to the filter
-method, a Task
instance?
@big-guy I think so - I was looking to make the https://github.com/google/protobuf-gradle-plugin better for the kotlin-dsl and noticed they have several APIs built on top of querying the task container with some filter applied. See https://github.com/google/protobuf-gradle-plugin/blob/42d741c1f32cd2551329f2f39ba374d59338267e/src/main/groovy/com/google/protobuf/gradle/ProtobufConfigurator.groovy#L131-L148
If the filter needs a Task
instance, then we have to create (and configure) the task, which defeats the purpose of trying to avoid that :)
You can get that kind of filtering with matching {}
already. That plugin code looks like it has some ordering issues since it uses findAll
. tasks.withType(GenerateProtoTask).matching { /* whatever filter */ }
would be the best replacement, returning TaskCollection
instead of Collection<GenerateProtoTask>
and recommending that people use configureEach {}
or all {}
instead of each {}
would go a long way.
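Putting the pieces together, a lazy replacement for the plugin's findAll-based lookup might look like this sketch (GenerateProtoTask and its flavors property belong to the protobuf plugin; the "demo" filter is illustrative):

```kotlin
// matching {} returns a live, filtered TaskCollection; combined with
// configureEach, the configuration only runs for tasks of this type
// that are realized and satisfy the spec.
tasks.withType(GenerateProtoTask::class.java)
    .matching { task -> "demo" in task.flavors }
    .configureEach {
        // configuration applied only to matching, realized tasks
    }
```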
Thanks @big-guy - matching
looks like exactly what I want. Changing to TaskCollection
was my plan too, thanks for the reinforcement.
I think the plugin appears to have ordering issues, but as long as you stick close to the examples it works: it stores some of the configuration Closure<>
in a List
and does some afterEvaluate
configuration.
Thanks!
No problem @mkobit. I'd start small with the changes you've described, but I think it should be possible to get rid of all of the special handling (the list of closures).
Further down the line...
I would also suggest that the task mutation methods should actually be more like:
void ofFlavor(String flavor, Action<GenerateProtoTask> action) {
project.tasks.withType(GenerateProtoTask).matching { it.flavors.contains(flavor) }.all(action)
}
On the DSL side, the build script would change to:
protobuf {
...
generateProtoTasks {
ofFlavor("demo") {
// some configuration for 'demo' flavored tasks
}
}
}
To make the plugin more Kotlin (and Java) friendly, you'd also want to replace all of the methods that take a Closure
with the same method taking an Action<delegate type>
. And then change where those classes are created to use ObjectFactory
instead of just new
ing instances. This gives Gradle a chance to decorate the classes and make the DSL consistent across Kotlin/Java/Groovy.
e.g., this new
should go away: https://github.com/google/protobuf-gradle-plugin/blob/42d741c1f32cd2551329f2f39ba374d59338267e/src/main/groovy/com/google/protobuf/gradle/ProtobufConvention.groovy#L41
And be replaced with something like project.objects.newInstance(ProtobufConfigurator, project, fileResolver)
.
@big-guy regarding https://github.com/gradle/gradle/issues/5664#issuecomment-408987405 , is there a separate issue for publishing/producing outgoing artifacts?
@mkobit Yes, there are a couple of low level issues we're tracking on another board (because our team is responsible for the feature):
https://github.com/gradle/gradle-native/issues/730 (fixing things internally)
https://github.com/gradle/gradle-native/issues/723 (providing a public API)
What is the expected implementation pattern for a plugin that wants to register tasks for user-registered domain objects?
myContainer {
register("hello")
}
From the plugin side, how should I be handling task registration for a user-registered domain object? A configureEach
doesn't seem (at first glance) to work correctly, so do I need to "force" the configuration to be realized for each item (like whenObjectAdded
) in order to react to it?
For example:
plugins {
base
}
tasks {
named("wrapper", Wrapper::class) {
gradleVersion = "5.0-milestone-1"
}
}
class MyThing(private val name: String) : Named {
override fun getName(): String = name
}
open class MyPlugin : Plugin<Project> {
override fun apply(target: Project) {
target.run {
val myThingContainer = container { thingName ->
println("Creating $thingName")
MyThing(thingName)
}
extensions.add("myThingContainer", myThingContainer)
myThingContainer.configureEach {
tasks.create("createdTaskFor${this.name}")
tasks.register("registeredTaskFor${this.name}")
}
}
}
}
apply<MyPlugin>()
configure<NamedDomainObjectContainer<MyThing>> {
register("first")
create("second")
}
Tasks are created and registered only for create("second")
We are proud to announce that a new community Slack channel has been created to discuss and provide feedback about the new Task API for configuration avoidance. From gradle-community.slack.com, proceed to the #config-avoidance
channel.
@mkobit Your observations are right. If you use the new API with your container, you won't see the tasks unless your container is realized. The simple answer for now is to use the eager API (.all(Action)
or .withType(Class, Action)
or .whenObjectAdded(Action)
) with the other container but still use the new API for registering tasks. We want this to work well with tasks first and have only done a bare minimum on the other types of containers.
I've raised this internally and we decided to continue with our validation and documentation route. I'll revisit this with the team for 5.0
tasks.withType<JavaCompile>().configureEach { }
is a very long API call when before developers could just do tasks.withType<JavaCompile> { }
. Is there some way that the API can be simplified?
In my build I've declared this just to simplify all the call sites to be less verbose.
inline fun <reified T : Task> TaskContainer.withTypeLazy(action: Action<in T>) =
    withType<T>().configureEach(action)
Another affected area from https://github.com/gradle/gradle/issues/5664#issuecomment-428616200 is nested containers. I think having the ability to react to registrations instead of object creation is needed, especially when needing to register additional elements in other containers.
What is the suggested way to replace a lazy task action? Since it was mentioned to avoid, for example: tasks.replace('compileJava')
Also, in case we do replace with the replace method, does it keep the replaced task's configuration values and properties, or does it clean everything so that inputs and outputs need to be defined from scratch?
Thanks
Jonathan
What is the suggested way to replace a lazy task action?
I don't think there really is one (I may be wrong). What's the use case for needing to replace the compileJava
task?
We need to run the Java compilation with the IntelliJ javac2 extension for backwards compatibility.
With the current build system (Ant) we define a taskdef to configure the Java compilation.
We are doing something similar to this:
But we converted all our new build scripts (Gradle) to the new API and now we are looking for the right solution.
So the solution in that Stack Overflow post doesn't actually replace the compiler; it merely uses the javac2 compiler to instrument (I'm guessing that's similar to AspectJ's post-compile weaving) the class files.
This won't require replacing the compileJava
task, thankfully. It looks like the suggestion in that SO post will probably work. You're probably going to need to configure it so that it also instruments your test code as well.
We need to replace the task action for a bunch of subprojects.
We currently use the deprecated method deleteAllActions() at configuration time.
I understand from your answer that currently there isn't a way to overwrite a task action declared with the new API, right?
@Jonatha1983 The replace
method was unsafe in a lot of ways. We are deprecating the replace
for realized tasks, which is how you currently use it, and will support a narrow, safe set of scenarios where replace can work for unrealized tasks. Given that, it's much safer to disable the task you want to replace and add a dependency on its replacement. If you want to keep the configuration UX the same (meaning configuring compileJava
would configure your task), it is recommended to use the Provider
API in your new task and simply return the values from the compileJava
task of interest. You would effectively "link" both tasks together, where modifying compileJava
would modify your custom task.
Overwriting an action is also deprecated for caching purposes, and little to no valid use case exists for such a feature.
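A rough sketch of the "disable and link" idea (InstrumentJava and its properties are hypothetical; the point is the Provider-API wiring, not a drop-in solution):

```kotlin
val compileJava = tasks.named("compileJava", JavaCompile::class.java)

val instrumentJava = tasks.register("instrumentJava", InstrumentJava::class.java) {
    // Link configuration through providers: anything configured on
    // compileJava flows into this task without realizing it eagerly.
    source(compileJava.map { it.source })
    classpath.from(compileJava.map { it.classpath })
}

// Disable the original task and hook the replacement into the task graph.
tasks.named("classes") { dependsOn(instrumentJava) }
compileJava.configure { enabled = false }
```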
It looks like https://github.com/gradle/gradle-native/issues/966 will bring the lazy task API out of incubation - are there any additional issues that should be created from this one for features or other work that will not be included in this release or not included at all so that this one can be closed? For example, in https://github.com/gradle/gradle/issues/5664#issuecomment-431033342 I think it makes sense at some point to offer an API similar to configureEach
but allows for users to react to registrations rather than when the object itself is created.
One of the things that I've noticed when looking at people's code snippets posted asking for help on Slack is that, generally, there is very little awareness of the difference between these two methods.
tasks.withType<WhateverTask> { }
tasks.withType<WhateverTask>().configureEach { }
Having to explain to people why they should use configureEach
is usually unhelpful because it's outside of the scope of whatever problem they have.
Most people see the second, see that it uses significantly more code, and won't bother using it.
Is there some way that the API can encourage users to use it? Currently, it just adds a lot of verbosity that is confusing for new users.
I think that some serious thinking about how to make this API more friendly and intuitive needs to happen in order to not create API confusion for people.
We could roll out configureEach(Class, Action)
to avoid some confusion. @adammurdoch and @big-guy any thought on that? It seems like a valid concern and BinaryCollection
does have it.
Couldn't it also be solved in the long run: add an eager API (if it doesn't already exist, e.g. all { … }
), then at one point switch the behavior of the withType(Class, Action)
to be equivalent to configureEach
rather than all
.
@lacasseio That's not a bad idea and we do want to bring BinaryCollection
closer to the other collection types. We should discuss this next planning.
@JLLeitschuh We're not planning on broadly changing the API at this point, but there are a few things we could do:
I think we shouldn't emit deprecation warnings until 6.0 (for removal in 7.0) because the cost of the deprecation warnings may be too high and many builds won't be able to do anything about it if the method is coming from a plugin.
I think we could revise the documentation to more strongly favor the new APIs. And we definitely need more work to be done around plugin development and explaining a lot of the new features (Provider API and configuration avoidance go hand-in-hand). I see this API and its adoption as primarily a plugin author effort than individual build scripts (although, that's important too), so getting good documentation out is key.
@mkobit We have a whole epic of "configuration avoidance" issues (organized on the gradle-native board because we were the team doing the work at the time) that hit around that:
This is high on my priority list to tackle when we start the next batch of work, but lower than improving Provider API adoption/use inside gradle/gradle. I'm on a bit of a crusade to get rid of the need for afterEvaluate
.
@tbroyer I think we have to support the old behavior/APIs for a long time, but that's a possibility. I also thought about having an opt-in feature preview that would switch the Groovy DSL:
task foo(type: Foo) { }
from eager to lazy. This could be switched on at a major version bump without impacting published plugins. This would make the Groovy DSL more similar to the Kotlin DSL behavior.
As @mkobit noticed, we're bringing these APIs out of incubation in 5.1.
For the most part, the APIs are the same since 4.9. There have been a few additions (more or less convenience methods) and some tightening of restrictions. If you limit yourself to just the APIs available in 4.9 and the restrictions in 5.0, plugins should work the same across the range (4.9-5.1).
Thanks for everyone's feedback here. It's been really helpful.