Status Update
Comments
mc...@ebay.com <mc...@ebay.com> #2
We are currently using AGP internal task types to flag memory-intensive tasks to enforce a reduced parallelism at execution time. I've raised this separately (with a lot more detail) as a feature request (
mc...@ebay.com <mc...@ebay.com> #4
Another use case that we have is to reactively respond to the creation of APKs and AABs. The new AGP APIs allow us to connect our tasks into the artifact pipeline via wiredWith
but the best we can come up with to receive the completed artifact is to wire in toTransform
. This A) does not guarantee that we will receive the final artifact, as more transforms may be applied after our task is called, and B) requires us to copy the input property file/dir to our task's output property file/dir in order to not break the build cache.
The reactive behavior of the above is the complicating factor.
A non-reactive approach could simply depend upon the task name and then look for a hardcoded path in the build directory (which is still sort of gross, since the build output paths are not documented as public API and change from time to time).
Another approach would be to wire a custom task to consume the output of the build via the built artifacts loader feeding an input property. However, this approach cannot be applied reactively. Either the custom task is included in the build and causes the creation of the binary artifact, or it is not included in the build and never gets invoked.
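For reference, a minimal sketch of that loader-based, non-reactive approach; ApkReportTask is a placeholder, and androidComponents refers to the components extension as in other snippets in this thread:
import com.android.build.api.artifact.SingleArtifact
import com.android.build.api.variant.BuiltArtifactsLoader
import org.gradle.api.DefaultTask
import org.gradle.api.file.DirectoryProperty
import org.gradle.api.provider.Property
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.Internal
import org.gradle.api.tasks.TaskAction

abstract class ApkReportTask : DefaultTask() {
    // Directory that contains the built APK(s) plus the metadata used by the loader
    @get:InputFiles
    abstract val apkFolder: DirectoryProperty

    @get:Internal
    abstract val builtArtifactsLoader: Property<BuiltArtifactsLoader>

    @TaskAction
    fun report() {
        val builtArtifacts = builtArtifactsLoader.get().load(apkFolder.get())
            ?: throw RuntimeException("Cannot load APKs from ${apkFolder.get().asFile}")
        builtArtifacts.elements.forEach { println("Built APK: ${it.outputFile}") }
    }
}

// Including this task in the build causes the APK to be produced - the non-reactive
// behavior described above.
androidComponents.onVariants { variant ->
    project.tasks.register("report${variant.name}Apk", ApkReportTask::class.java) {
        it.apkFolder.set(variant.artifacts.get(SingleArtifact.APK))
        it.builtArtifactsLoader.set(variant.artifacts.getBuiltArtifactsLoader())
    }
}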
cm...@google.com <cm...@google.com> #6
We didn't provide a task wiring helper for that case as there's only one thing to wire, but I can see how the inconsistency can be misleading
mc...@ebay.com <mc...@ebay.com> #7
WRT variant.artifacts.get(SingleArtifact.APK)
, if the task is included in the build it will cause the creation of the artifact. Our build is currently defined to reactively perform some actions (predominantly some fancy reporting) only if work is actually performed.
We had previously been pushing our build to wire into task outputs by locating tasks by type and referencing output properties as inputs to tasks registered via finalizedBy or dependsOn relationships. This started getting more and more fragile as the AGP API migration proceeded/matured. I'm at the point now where I think the notion of reactive execution is hostile to the direction/expectations of both Gradle and AGP and want to start moving away from it, yet our build as it currently stands does rely on this behavior.
I bring this up as a gap only because I don't know if I'll be able to completely refactor our CI pipeline's expectations in time for Gradle 8+.
xa...@google.com <xa...@google.com> #8
mc...@ebay.com <mc...@ebay.com> #9
Another minor functionality gap: We have a build that has test coverage enabled during test execution but then we manually disable the coverage report generation for all project modules as we have a custom coverage report task that creates an aggregate test coverage report for the entire project. This saves us the execution time, I/O, and protects us from Jacoco implementation instabilities.
We're currently using the following to accomplish this:
project.tasks.withType(JacocoReportTask::class.java) {
    enabled = false
}
mc...@ebay.com <mc...@ebay.com> #10
Another gap, though perhaps there's a better way to express this? Some of our builds leverage Flank to run instrumentation tests on Firebase Test Lab. These builds run as a single CI stage so as to afford Gradle the best opportunity to parallelize work. In this context, we have found that prioritizing instrumentation test assembly work early in the build allows the tests to dispatch to FTL earlier, minimizing overall build times. To implement this, we have chosen to be explicit on the inverse side by pushing lint and local unit test execution to be shouldRunAfter
the flank tasks which in turn depend on the instrumentation test assembly, etc.
Specifically:
private fun bumpFlankTask(project: Project, flankTasks: TaskCollection<FlankExecutionTask>) {
    listOf(AndroidLintTask::class.java, AndroidLintAnalysisTask::class.java, AndroidUnitTest::class.java)
        .forEach {
            project.tasks.withType(it).configureEach {
                shouldRunAfter(flankTasks)
            }
        }
}
This seems fairly specific to our project's desires and not necessarily transferable to other projects. I think our best option for the future Gradle 9+ might be to fall back to leveraging task names rather than leveraging task types in a generic fashion. Mentioning it here in case there is a better approach/option once the types are no longer available.
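For illustration, a name-based fallback might look like the following sketch; the name patterns are assumptions about this project's tasks and would need adjusting elsewhere:
private fun bumpFlankTaskByName(project: Project, flankTasks: TaskCollection<FlankExecutionTask>) {
    // Match on task names instead of AGP task types; fragile if the naming conventions change.
    project.tasks.configureEach {
        if (name.startsWith("lint") || name.contains("UnitTest")) {
            shouldRunAfter(flankTasks)
        }
    }
}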
mc...@ebay.com <mc...@ebay.com> #11
Another gap we've found, though we no longer directly depend upon it: when invoking BundleToStandaloneApkTask
the resulting universal APK does not appear to be accessible via the Artifacts API / ArtifactType.APK
- at least as of AGP 7.0.
We no longer directly depend upon it because we use the task name and a hardcoded build output directory path to locate the APK if/when it gets built. This is another symptom of our reactively defined build implementation. However, if we were to rely on
(phew! I think that's it for now? sorry for the dump, we're just starting to get caught up!)
xa...@google.com <xa...@google.com> #12
That last one (
mc...@ebay.com <mc...@ebay.com> #13
Ran into another use case that the API does not yet seem to support: AndroidUnitTest configuration for offline Jacoco instrumentation. Given that we've had to tweak task outputs to get this to work reliably with the build cache it is probably best corrected on the AGP side. Probably easiest just to paste the relevant code here:
/*
* -Djacoco-agent.destfile arg is used to configure the offline mode behavior of jacoco. The offline
* behavior is what is used when dependency module code is executed as it has already been
* instrumented by the jacoco-agent in previous executions. We redirect this to record the offline
* results under the build directory and give it a more explicit/identifiable name (it defaults
* to the project dir as jacoco.exec).
*
* NOTE: Attempts at using JacocoTaskExtension.setDestinationFile were unsuccessful in capturing coverage
*/
project.tasks.withType(AndroidUnitTest::class.java).configureEach {
    val execFile = project.layout.buildDirectory.file("jacoco/offlineDependencies.exec").get()
    jvmArgs("-Dfile.encoding=UTF-8", "-Djacoco-agent.destfile=${execFile}")
    // Register our file as a task output to ensure it is restored via the build cache when execution is avoided
    outputs.file(execFile)
    doFirst {
        // Make sure the coverage file is removed if it exists from a previous run
        execFile.asFile.deleteRecursively()
    }
}
mc...@ebay.com <mc...@ebay.com> #14
And one question:
We have some convention plugin code which is applied to many project modules. It uses the components extension's onVariants
callback to reactively trigger some data capture but needs to know whether or not minification has been enabled for the build type, configured by a separate convention plugin. We have code similar to the following:
project.plugins.withId("com.android.application") { plugin ->
    val extension = project.extensions.getByType(ApplicationAndroidComponentsExtension::class.java)
    val android = project.extensions.getByType(ApplicationExtension::class.java)
    extension.onVariants { variant ->
        ...
        if (android.buildTypes.getByName(variant.buildType!!).isMinifyEnabled) {
            ...
        }
    }
}
ApplicationExtension
feels like more of an input API for AGP and not something we should be programmatically querying within the onVariants
callback. Given the exposure of other configuration values (e.g. variant.pseudoLocalesEnabled
as Property<Boolean>
), should isMinifyEnabled
also be exposed?
je...@google.com <je...@google.com> #15
Alex, can you look at #11 first, then at #13.
je...@google.com <je...@google.com> #16
#14, yes, it should probably be offered in ApplicationVariantBuilder.
je...@google.com <je...@google.com> #17
Alex and I looked a bit more carefully and we cannot make APK the result of the bundleToAPK task. The reason is that only one task can produce an artifact type at a time.
You cannot have an artifact type being produced by either the normal APK packaging task or the APKFromBundle task depending on what the user requested. In particular, if any script/plugin requests the public APK artifact type, which task is supposed to run?
The only way we can somehow satisfy #11 would be to have another public artifact type called APK_FROM_BUNDLE which would ensure that the bundle is created first, then the APK from the bundle. Would that work?
xa...@google.com <xa...@google.com> #18
We should really file separate bugs for all these comments. We can't just use a single bug for all of this.
xa...@google.com <xa...@google.com> #19
I filed the following specific bugs:
-> comment #4: Issue 232323922
-> comment #9: Issue 232324065
-> comment #11: Issue 232325458
-> comment #14: Issue 232325329
I have not yet filed anything related to Jacoco, as we probably need to discuss things a bit internally first.
ag...@gmail.com <ag...@gmail.com> #20
We develop a convention plugin which adds some code quality tasks (detekt, checklist, lint) to the build whenever the assemble task is invoked. We achieved this on the old API by obtaining the assemble task provider with the BaseVariant.getAssembleProvider() method and then adding our tasks as dependencies of it. However, I couldn't find any equivalent method in the new Variant API. I thought of registering a custom DefaultTask which depends on SingleArtifact.APK and all of our other custom tasks, so that host apps could use this new task on their local and CI machines to create the APK. However, this method is not optimal since some developers can still run the assemble task directly and bypass our code quality tasks, and this new task will not be executed when developers run their project in Android Studio by default. So I would like a way to include some tasks in the artifact creation even if they do not produce or use that artifact. What I can suggest is a way to make artifacts depend on tasks, similar to how tasks can depend on artifacts. For instance, something similar to the following would solve our use case:
variant.artifacts.get(SingleArtifact.APK).getTaskProvider().configure {
    it.dependsOn("detekt${variant.name.capitalize()}")
}
Currently, what we are doing to solve this problem is the following:
project.extensions.getByType(AndroidComponentsExtension::class.java).onVariants { variant ->
    project.afterEvaluate {
        project.tasks.named("assemble${variant.name.capitalize()}").configure {
            it.dependsOn("detekt${variant.name.capitalize()}")
        }
    }
}
which I know is not recommended. Therefore, could you consider adding a new API for this case?
Another thing is that I couldn't find a way to obtain the "lint" task for the current variant without tasks.named("lint${variant.name.capitalize()}"). Could you also create a method to obtain the lint task for the current variant and add dependency tasks to it? Adding a dependency for lint is important for us since we download our lint configuration with a custom task and we want this configuration to be ready when "lint" needs it. (This case could also be solved by using the Provider API in the lint block of finalizeDsl; however, the "lintConfig" property is currently declared as a File in this block.)
And lastly, it would be nice if InternalArtifactType.JAVA_DOC_DIR were a public artifact. If our convention is applied to a library project, whenever we publish the AAR we also publish its javadoc (and kdoc) to a remote server. For this we need to add a custom Javadoc task and configure it properly for the variant. However, I think this process is a bit complex. I could not properly configure the custom Javadoc task using the new variant API. With the old API I was using the following configuration:
source = variant.getJavaCompileProvider().get().source
classpath += project.files(project.provider { androidExtension.bootClasspath })
classpath += project.files(variant.getJavaCompileProvider().map { it.classpath })
Therefore, making InternalArtifactType.JAVA_DOC_DIR public would greatly simplify our implementation and solve our problems. Currently, the only solution I found was to add a dependency on the "javaDoc${variant.name.capitalize()}Generation" task and hardcode its output path as "intermediates${File.separator}java_doc_dir${File.separator}$componentName". But I know this is really fragile and would love to see an easier and more conventional way to accomplish this.
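For what it's worth, a rough sketch of wiring a Javadoc task from the new Variant API is below; it assumes the Sources API and a public variant.compileClasspath property are available in the AGP version in use, which may not hold for older versions:
val androidComponents = project.extensions.getByType(AndroidComponentsExtension::class.java)
androidComponents.onVariants { variant ->
    project.tasks.register("javadoc${variant.name.capitalize()}", Javadoc::class.java) { task ->
        // Java sources for the variant via the Sources API
        variant.sources.java?.all?.let { javaDirs -> task.source(javaDirs) }
        // Boot classpath plus the variant's compile classpath (compileClasspath is assumed
        // to be exposed publicly on the variant in the AGP version in use)
        task.classpath = project.files(
            androidComponents.sdkComponents.bootClasspath,
            variant.compileClasspath
        )
    }
}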
ww...@gmail.com <ww...@gmail.com> #21
I'm developing a plugin for compiling external sources into *.so files.
How can I inject the final *.so files into the final APK/AAR? I saw that AGP has AndroidArtifacts.ArtifactType.JNI{_SHARED}, but I have no idea how to use it and can't find any samples. I also can't use SingleArtifact.APK because it is transformable but not appendable.
variant.artifacts.use(task)
    .wiredWith { it.outputSoFolder }
    .toAppendTo(...) // <-- what to put here?
EXTRA: how can I add their debug symbols to LLDB during debugging from the plugin (the same as Makefiles/CMake do)?
xa...@google.com <xa...@google.com> #22
AndroidArtifacts.ArtifactType is internal and not part of our public API. The thing to pass to toAppendTo would have to be a MultipleArtifact, but we don't expose many of them yet, and none that are useful for your use case.
At some point we may expose the intermediate artifact that is the final folder of all the .so
files, but that may not be what you want either. If it's the final folder, then it's a single folder, so you cannot append to it, and you can only transform it (which means taking the content, processing it, and writing the output in a different folder). This is not efficient when you want to add new files to it.
So, your use case is actually better positioned to use source sets rather than injecting into an intermediate. We recently introduced
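Presumably this refers to the Sources API; a rough sketch of that direction, assuming a hypothetical CompileNativeTask whose outputSoFolder DirectoryProperty holds the built .so files, and an androidComponents reference as in other snippets in this thread:
androidComponents.onVariants { variant ->
    val compileNative = project.tasks.register("compile${variant.name}NativeLibs", CompileNativeTask::class.java)
    // Registering the output folder as a generated jniLibs source lets the normal
    // merge/packaging steps pick up the .so files, instead of injecting into an intermediate.
    variant.sources.jniLibs?.addGeneratedSourceDirectory(
        compileNative,
        CompileNativeTask::outputSoFolder
    )
}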
mc...@ebay.com <mc...@ebay.com> #23
I just ran into a need to modify android test manifests in my convention plugin. I intended to use the artifacts API to do this, but it looks like there is no SingleArtifact.* making this available. As per the AGP 7.2 docs on MERGED_MANIFEST: "For each module, unit test and android test variants will not have a manifest file available." Can we please get these added as well?
dr...@gmail.com <dr...@gmail.com> #25
We are currently using the old AndroidSourceSet
APIs to configure Checkstyle and Detekt for Android projects, as described in this issue:
As far as I can tell, this is not yet covered by the new APIs and this would likely apply to other static analysis tools that need to process source files as well.
dr...@gmail.com <dr...@gmail.com> #26
I created a new ticket talking about this AndroidSourceSet
use case here:
he...@amazon.com <he...@amazon.com> #27
My project currently uses the javaCompileProvider and preBuildProvider APIs of the BaseVariant class.
For the javaCompileProvider case, we use this mainly in application projects. We have some custom code generation tasks that require the classpath of the application to be available (so we use the output of the task), and we want them to run "at the same time", so that tasks that depend on JavaCompile have the code we generate ready as well. Example usage is:
android.applicationVariants.configureEach { variant ->
    val customTaskOne = project.tasks.register("customTaskOne${variant.name.capitalized()}") {
        dependsOn(variant.javaCompileProvider, kotlinCompileTask)
    }
    val javaCompileOutput = variant.javaCompileProvider.get().destinationDirectory.get().asFile
    val codeGenerationAction = CodeGenAction(javaCompileOutput, variant)
    variant.javaCompileProvider.configure {
        finalizedBy(customTaskOne)
        doLast(codeGenerationAction)
    }
}
I understand this is a bit janky, but I'm looking to update the code and make sure we're doing things "The Right Way(tm)" going forward. If there's a similar way to accomplish what we're looking for, I'd love to know about it, especially if it's a tool or technique that I'm not familiar with.
For the preBuildProvider use case, we're essentially generating some code early on in the process that just needs to be ready. I can likely use some other method of having this task run (as it doesn't really depend on anything else other than being done before the APK is packaged). What would be the suggested way with the new gradle-api classes to perform this sort of work?
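One possible direction for that kind of early code generation (a sketch, not an official recommendation) is to register the generator's output as a generated source directory, which schedules it before compilation and therefore before packaging; GenerateCodeTask and its outputDir are hypothetical:
androidComponents.onVariants { variant ->
    val generate = project.tasks.register("generate${variant.name}EarlyCode", GenerateCodeTask::class.java)
    // Wiring the output directory as a generated java source makes AGP run the task before
    // compilation, similar to what hanging work off preBuild achieved.
    variant.sources.java?.addGeneratedSourceDirectory(
        generate,
        GenerateCodeTask::outputDir
    )
}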
EDIT: I've spent the last few days looking through the .class
file for inclusion in the dex or generate a JSON file for inclusion in the application's assets
.
For reference, all of this is with AGP 7.4.
I tried using the same task for both to see if AGP would know how to handle that:
variant.artifacts
    .forScope(ScopedArtifacts.Scope.ALL)
    .use(scannerTask)
    .toAppend(
        ScopedArtifact.CLASSES,
        ScannerTask::output
    )
variant.artifacts.forScope(ScopedArtifacts.Scope.ALL)
    .use(scannerTask)
    .toGet(
        ScopedArtifact.CLASSES,
        ScannerTask::allJars,
        ScannerTask::allDirectories
    )
but that led to things just not executing. I didn't see anything with the task name in the --debug
output. So, I went ahead and tried using two tasks: one for toGet
that would scan all the input and generate the .class
file, and one that would then take the output of that scan task and add it using toAppend
. Attempting to do this led to a circular dependency, leading me to believe that toGet
is ALWAYS executed last.
So, I went ahead and tried using toTransform
:
variant.artifacts.forScope(ScopedArtifacts.Scope.ALL)
    .use(scannerTask)
    .toTransform(
        ScopedArtifact.CLASSES,
        ScannerTask::allJars,
        ScannerTask::allDirectories,
        ScannerTask::output
    )
And that worked! The class was generated, and included in the dex
file. The problem was that the API was expecting me to essentially touch every input file and then add them to the output. That sounds like it's going to kill my build times.
Am I on the right track here and maybe just missing an API to use? Or is this use case not supported by the current APIs?
je...@google.com <je...@google.com> #28
You are correct, the toTransform
is the only API you can use in your case because you are trying to get the final version of the artifact in your scannerTask
while also trying to append (from the same Task). Even if you used 2 tasks, you would end up in a circular dependency.
You are also correct this is not going to be great for your build time.
One way I can think of would be to make a new version of toTransform that would be a lot smarter and allow you to tag unchanged jars/directories. I think that would solve your case completely?
But in the meantime, maybe using a KSP or plain old annotation processor might be another solution, not exactly sure about your constraints.
he...@amazon.com <he...@amazon.com> #29
I'm sure I could get that to work, having a sort of incremental toTransform
, though I don't think that would be the ideal solution for my particular use case. My attempt to use toTransform
was based off it allowing me to essentially take a look at every class that's going to end up in the package. I'm not looking to actually modify any of the classes that I scan. With the API you suggested, we would essentially be marking everything we look at as "unchanged", and then attempting to add a new class/asset based on what we saw, or perhaps modifying one or two classes/assets by adding the results of our scans.
My end goal is to essentially "append" to the output using everything previously built as an input. So, maybe not "append", but almost "finalize". That's why we currently use the finalizedBy
and doLast
APIs for the JavaCompile
task. I think my big concern would be using the "transform" API in a way that isn't strictly "transforming". If that's not a concern that you share, then this certainly would be worth trying out.
But in the meantime, maybe using a KSP or plain old annotation processor might be another solution, not exactly sure about your constraints.
That's certainly one of the avenues I'm investigating. I'm just trying to make sure I've taken a look at and understand all of the options that are available.
xa...@google.com <xa...@google.com> #30
The problem of a finalize
API is that only one thing can do it. If we expose this as a proper API, then we have to make sure only 1 plugin can do it and fail if 2 plugins try to do it. If we start having several published plugins using that API, they will not be compatible with each other.
This is really not a path we want to go down at the moment.
So "transforming" but not actually touching the files is perfectly fine (as long as you do copy them into the output), though you have to realize that the API cannot guarantee that you are last. You have to manage your plugin application order and hope your transform is added last.
he...@amazon.com <he...@amazon.com> #31
The problem of a finalize API is that only one thing can do it. If we expose this as a proper API, then we have to make sure only 1 plugin can do it and fail if 2 plugins try to do it. If we start having several published plugins using that API, they will not be compatible with each other.
Oh yeah, I absolutely understand the turmoil adding an API like that can cause, especially down the line. I don't blame you at all for not wanting to codify that potential nightmare in the public API. If the "transform" API is built and maintained in such a way that it accounts for folks not always actually wanting to transform classes, I think that would be sufficient.
you have to realize that the API cannot guarantee that you are last.
That's okay. No external dependencies should be using these, as they're strictly internal. We also don't need to worry about other internal plugins using/generating classes that we would be required to scan. We don't need to be "last" as much as we need to be "after compilation but before packaging", which this API would provide us.
je...@google.com <je...@google.com> #32
I have been thinking about this a bit more and it's actually not easy to provide an API where you can identify some untouched inputs as outputs.
The main reason is that Gradle will complain if 2 tasks output the same file/directory, so one way or another we must copy the inputs into the outputs, which is probably what you already do. At least we would save the merging step, which is an improvement, but there would still be a fair amount of I/O.
he...@amazon.com <he...@amazon.com> #33
Would it be possible to have a "Read Only" API that allows scanning/reading of the non-generated code for the project which would be followed by tasks that perform this code generation/modification? Something like
variant.artifacts
    .forScope(ScopedArtifacts.Scope.ALL)
    .scan(scannerTask)
    .andOutput(writeTask)
    .with(
        ScopedArtifact.CLASSES,
        ScannerTask::allJars,
        ScannerTask::allDirectories,
        WriteTask::output
    )
So, my scannerTask
would be responsible for running over the classes, and building up the manifest that it wants to generate. Then, the writeTask
would take that manifest and generate code to the output
directory. This way you have a very clear set of processing and output tasks. On my side, I could keep the manifest in memory, since I expect the tasks to be run in a pair (as in, writeTask
wouldn't run if scannerTask
hasn't). If it's better practice, scannerTask
can be used as an input for writeTask
.
I can see the scan
API being useful for any sort of processing on the APK that needs to be done, including any sort of reporting folks might want. It can allow classes to be scanned, but not necessarily modify the output. However, if andOutput
were included, the writeTask
would be added alongside the other code modifying tasks. I think the order here can make sense if the scannerTask
were always to operate on the non-generated, non-modified classes.
I'm not sure if that all makes sense as I'm spitballing.
je...@google.com <je...@google.com> #34
You are still introducing a circular reference, as the Scanner Task wants to have access to all the final CLASSES and generate a manifest that the WriterTask would use to generate a new element of the CLASSES artifact. The fundamental issues here are:
- provide an API that allows transforming while mostly leaving the original items unchanged.
- be independent of the plugin apply order.
Mostly something like:
variant.artifacts.forScope(ScopedArtifacts.Scope.ALL)
    .use(scannerTask)
    .toGetAndAdd(
        ScopedArtifact.CLASSES,
        ScannerTask::allJars,
        ScannerTask::allDirectories,
        ScannerTask::output
    )
but that means you are not guaranteed to have the final version of CLASSES as some other Plugin may add a folder after you...
he...@amazon.com <he...@amazon.com> #35
That's why I was trying to phrase it as "non-generated", but I'm not sure how useful that would be outside of my specific use case (and obviously nobody wants to support an API for some weird one-off). For our project, what other plugins might do work after we perform ours doesn't matter, as I can guarantee for my project, in this instance, that things will behave as I expect.
If this is something you think might be worthwhile to add, with the above caveats, that would be swell. We'll likely use it in some form. If it's not something that seems like it would be worth supporting, which is not unreasonable, I'm sure I'll figure out some other solution before the time comes to migrate. Necessity is the mother of invention, after all.
I appreciate the discussion and consideration.
da...@gmail.com <da...@gmail.com> #36
Hi,
I wrote a plugin that generates a wrapper around string resources, so I had a mockable class while still being able to use string resources in viewModels etc.
The way I did this before was to add a dependency on the process<...>Resources task, find the R.jar file, unzip and use a class visitor to extract the string/plural names. Then use codegen to create the wrapper with this data. Then I had to add my task as a dependency on compile<...>Kotlin.
It's taken me a while to get to grips with the new system, just trying to slot my task to run at the correct time, but I've ended up with something like the following:
project.plugins.withType(AppPlugin::class.java) {
    val androidComponents = project.extensions.getByType(AndroidComponentsExtension::class.java)
    androidComponents.onVariants { variant ->
        variant.sources.java?.let { sources ->
            val generateStringsTask = project.tasks.register(
                "generate${variant.name}Strings",
                GenerateVariantStringsTask::class.java,
            ) {
                it.variantPackageName.set(variant.namespace)
                it.outputDir.set(project.layout.buildDirectory.dir("generated/source/stringrepository/${variant.name}"))
            }
            variant.artifacts
                .use(generateStringsTask)
                .wiredWith(GenerateVariantStringsTask::symbolsFile)
                .toListenTo(SingleArtifact.RUNTIME_SYMBOL_LIST)
            sources.addGeneratedSourceDirectory(
                generateStringsTask,
                GenerateVariantStringsTask::outputDir
            )
        }
    }
}
I'll admit, this new way is a hell of a lot easier for getting the data I need. The problem is that the number of exposed artifacts is pretty limited (at least compared to what I've seen for InternalArtifactType).
RUNTIME_SYMBOL_LIST includes transitive dependencies, breaking my generated code as the module's R class can no longer 'see' those transitive resources (with the default AGP 8+ non-transitive R class setting).
Ideally, having a non-transitive version of that artifact (pointing to the local_only_symbol_list file?) would be perfect, but considering my confusion in picking up these new APIs there's a good chance I'm wrong on the approach for this.
Any guidance would be very much appreciated.
Thanks
je...@google.com <je...@google.com> #37
#36, I filed
Description
As part of the Android Gradle plugin team's plan to help developers more easily upgrade to newer versions of the Android Gradle plugin, we need to migrate plugins and build scripts away from using internal implementation details of the plugin. See the Gradle plugin roadmap for more details about our planned timeline.
This issue aims to capture use cases that are not currently supported by the APIs of the Android Gradle Plugin.
If you have a use case for extending your Android app or library build that is not covered by the APIs that are available in the com.android.tools.build:gradle-api maven artifact, such as directly reading or modifying the tasks registered by the Android Gradle plugin, please comment here, explaining what you're trying to achieve.
For example, explaining "I have a custom static analysis tool to run locally and on CI that needs all the shrinking configuration and the final APK as an input, and at the moment I'm maintaining a custom task that consumes these intermediate files as inputs" is more helpful than just stating the intermediate files without the bigger picture about what you want to use them for.
Even if your use case is similar, but not identical, to one already posted, please include it here, to make sure we're aware of your specific use case.
We're planning to remove the old APIs in Android Gradle plugin 9.0 (Mid 2023), and we want to minimise the disruption caused by that as much as we can.
Thanks for your help in improving the build experience for all developers!