Status Update
Comments
mc...@ebay.com <mc...@ebay.com> #2
cm...@google.com <cm...@google.com> #3
mc...@ebay.com <mc...@ebay.com> #4
cm...@google.com <cm...@google.com> #5
cm...@google.com <cm...@google.com> #6
Android Studio version: 0.8.12
buildToolsVersion 21.0.1
Gradle 1.11
mc...@ebay.com <mc...@ebay.com> #7
--set-max-idx-number=<value>
Unfortunately, changing the default is not a solution, since the linearAlloc limit can be reached at very different levels depending on the class hierarchy and other criteria.
In addition, for most applications, moving to multidex will only help to work around the linearAlloc limit at installation. The application will still crash against the same limit at execution. The only working use case where I know multidex can help with linearAlloc is when the APK does not contain one application but distinct pieces running in separate processes.
xa...@google.com <xa...@google.com> #8
It's nice to know about that command line option. I do not see it in the output of 'dx --help'; it might be good to add that.
I'm not very familiar with the 'linearAlloc limit' issue outside of the context of the dexopt step. My sample app is able to run once the lower idx value is set, although I do not actually call into any of the library code that is bundled with the app. I assume it's undefined when/if the 'linearAlloc limit' will be hit in a large application on Gingerbread.
I'm a bit confused as to the platform compatibility of multidex given the 'linearAlloc limit' bug. What specific versions of Android are supported? The multidex code implies support back to v4.
mc...@ebay.com <mc...@ebay.com> #9
The linearAlloc limit is reached when loading classes. At install time, dexopt loads all classes contained in the dex, so it faces the limit immediately. At execution, the limit may be reached after some delay depending on your usage of the packaged classes. If you face it at install time but not at execution, this means you never trigger the loading of some classes. In a real application those never-loaded classes should have been shrunk away manually or by ProGuard. The exception is when there are different groups of classes in the dex files used in separate processes.
About the multidex library's supported versions, I recently merged a change to try to be clearer.
The summary is that the library should work down to API 4 (Donut), but below ICS, applications will probably be hit by the linearAlloc limit.
mc...@ebay.com <mc...@ebay.com> #10
dexOptions {
    additionalParameters = ['--multi-dex', '--set-max-idx-number=40000']
}
mc...@ebay.com <mc...@ebay.com> #11
je...@google.com <je...@google.com>
mc...@ebay.com <mc...@ebay.com> #14
And one question:
We have some convention plugin code which is applied to many project modules. It uses the components extension's onVariants callback to reactively trigger some data capture, but it needs to know whether or not minification has been enabled for the build type, which is configured by a separate convention plugin. We have code similar to the following:
project.plugins.withId("com.android.application") { plugin ->
    val extension = project.extensions.getByType(ApplicationAndroidComponentsExtension::class.java)
    val android = project.extensions.getByType(ApplicationExtension::class.java)
    extension.onVariants { variant ->
        ...
        if (android.buildTypes.getByName(variant.buildType!!).isMinifyEnabled) {
            ...
        }
    }
}
ApplicationExtension feels like more of an input API for AGP and not something we should be programmatically querying within the onVariants callback. Given the exposure of other configuration values (e.g. variant.pseudoLocalesEnabled as Property<Boolean>), should isMinifyEnabled also be exposed?
je...@google.com <je...@google.com> #15
Alex, can you look at #11 first, then at #13?
je...@google.com <je...@google.com> #16
#14, yes, it should probably be offered in ApplicationVariantBuilder.
al...@google.com <al...@google.com>
je...@google.com <je...@google.com> #17
Alex and I looked a bit more carefully, and we cannot make APK the result of the bundleToAPK task. The reason is that only one task can produce a given artifact type at a time.
You cannot have an artifact type produced by either the normal APK packaging task or the APKFromBundle task depending on what the user requested. In particular, if any script/plugin requests the public APK artifact type, which task is supposed to run?
The only way we can somehow satisfy #11 would be to have another public artifact type called APK_FROM_BUNDLE, which would ensure that the bundle is created first, then the APK from the bundle. Would that work?
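For illustration, a minimal sketch of how a script could consume such an artifact type if it were added. SingleArtifact.APK_FROM_BUNDLE is only the proposed name from this comment, and VerifyApksTask with its apksFromBundle property is a hypothetical consumer, assuming the artifact would be exposed as a single file:
// androidComponents = project.extensions.getByType(AndroidComponentsExtension::class.java)
abstract class VerifyApksTask : DefaultTask() {
    // Hypothetical input wired to the proposed APK_FROM_BUNDLE artifact type.
    @get:InputFile
    abstract val apksFromBundle: RegularFileProperty

    @TaskAction
    fun verify() {
        logger.lifecycle("APK(s) extracted from bundle: ${apksFromBundle.get().asFile}")
    }
}

androidComponents.onVariants { variant ->
    project.tasks.register(
        "verify${variant.name.replaceFirstChar { it.uppercase() }}ApkFromBundle",
        VerifyApksTask::class.java
    ) {
        // Requesting the artifact would make AGP build the bundle first, then the APK(s) from it.
        it.apksFromBundle.set(variant.artifacts.get(SingleArtifact.APK_FROM_BUNDLE))
    }
}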
xa...@google.com <xa...@google.com> #18
We should really file separate bugs for all these comments. We can't just use a single bug for all of this.
xa...@google.com <xa...@google.com> #19
I filed the following specific bugs:
-> comment #4: Issue 232323922
-> comment #9: Issue 232324065
-> comment #11: Issue 232325458
-> comment #14: Issue 232325329
I have not yet filed anything related to Jacoco, as we probably need to discuss things a bit internally first.
ag...@gmail.com <ag...@gmail.com> #20
We develop a convention plugin which adds some code quality tasks (detekt, checklist, lint) to the build whenever the assemble task is invoked. We achieved this with the old API by obtaining the assemble task provider with the BaseVariant.getAssembleProvider() method and then adding our tasks as dependencies of it. However, I couldn't find any equivalent method in the new Variant API.
I thought of registering a custom DefaultTask, which depends on SingleArtifact.APK and all of our other custom tasks, so that host apps could use this new task on their local and CI machines to create the APK. However, this method is not optimal: some developers can still run the assemble task directly and bypass our code quality tasks, and this new task will not be executed when developers run their project from Android Studio by default.
So I would like a way to include some tasks in the artifact creation even if they do not produce or use this artifact. What I can suggest is a way to make artifacts depend on tasks, similar to how tasks can depend on artifacts. For instance, something similar to the following would solve our use case:
variant.artifacts.get(SingleArtifact.APK).getTaskProvider().configure {
    it.dependsOn("detekt${variant.name.capitalize()}")
}
Currently, what we are doing to solve this problem is the following:
project.extensions.getByType(AndroidComponentsExtension::class.java).onVariants { variant ->
    project.afterEvaluate {
        project.tasks.named("assemble${variant.name.capitalize()}").configure {
            it.dependsOn("detekt${variant.name.capitalize()}")
        }
    }
}
which I know is not recommended. Therefore, could you consider adding a new API for this case?
Another thing is that I couldn't find a way to obtain the "lint" task for the current variant other than tasks.named("lint${variant.name.capitalize()}"). Could you also create a method to obtain the lint task for the current variant and add dependency tasks to it? Adding a dependency for lint is important for us since we download our lint configuration with a custom task and we want this configuration to be ready when "lint" needs it. (This case could also be solved by using the Provider API in the lint block of finalizeDsl; however, currently the "lintConfig" property is declared as a File in this block.)
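For reference, a hedged sketch of the finalizeDsl route mentioned above; because lintConfig is a plain File rather than a Property, the file path has to be resolvable at configuration time, which is exactly the limitation described. The path below is illustrative:
// androidComponents = project.extensions.getByType(AndroidComponentsExtension::class.java)
androidComponents.finalizeDsl { extension ->
    extension.lint {
        // lintConfig is a File, not a Property<RegularFile>, so a lazily downloaded
        // configuration cannot be wired in as a Provider; only a known path works here.
        lintConfig = project.rootProject.file("config/lint.xml")
    }
}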
And lastly, it would be nice if InternalArtifactType.JAVA_DOC_DIR were a public artifact. If our convention is applied to a library project, whenever we publish the AAR, we also publish its javadoc (and kdoc) to a remote server. For this we need to add a custom Javadoc task and configure it properly for the variant. However, I think this process is a bit complex, and I could not properly configure the custom Javadoc task using the new variant API. With the old API I was using the following configuration:
source = variant.getJavaCompileProvider().get().source
classpath += project.files(project.provider { androidExtension.bootClasspath })
classpath += project.files(variant.getJavaCompileProvider().map { it.classpath })
Therefore, making InternalArtifactType.JAVA_DOC_DIR public would greatly simplify our implementation and solve our problems. Currently, the only solution I found was to add a dependency on the "javaDoc${variant.name.capitalize()}Generation" task and hardcode its output path as "intermediates${File.separator}java_doc_dir${File.separator}$componentName". But I know this is really fragile and would love to see an easier and more conventional way to accomplish this.
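A hedged sketch of what a per-variant Javadoc task could look like using only public APIs (variant sources plus the SDK boot classpath); it deliberately leaves out the dependency/compile classpath, which is the part a public JAVA_DOC_DIR or compile-classpath artifact would still be needed for. Task and path names are illustrative:
// androidComponents = project.extensions.getByType(AndroidComponentsExtension::class.java)
androidComponents.onVariants { variant ->
    val javaDirs = variant.sources.java?.all ?: return@onVariants
    project.tasks.register(
        "javadoc${variant.name.replaceFirstChar { it.uppercase() }}",
        Javadoc::class.java
    ) { task ->
        // Variant Java source directories, resolved lazily.
        task.source(project.files(javaDirs))
        // android.jar only; library dependencies would still need to be added to the classpath.
        task.classpath = project.files(androidComponents.sdkComponents.bootClasspath)
        task.setDestinationDir(project.layout.buildDirectory.dir("docs/javadoc/${variant.name}").get().asFile)
        task.isFailOnError = false
    }
}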
ww...@gmail.com <ww...@gmail.com> #21
I'm developing a plugin that compiles external sources into *.so files.
How can I inject the final *.so files into the final APK/AAR? I saw that AGP has AndroidArtifacts.ArtifactType.JNI{_SHARED}, but I have no idea how to use it and can't find any samples. I also can't use SingleArtifact.APK because it is transformable but not appendable.
variant.artifacts.use(task)
    .wiredWith { it.outputSoFolder }
    .toAppendTo(...) // <-- what to put here?
EXTRA: how can I add their debug symbols to LLDB during debugging from the plugin (the same as Makefiles/CMake do)?
xa...@google.com <xa...@google.com> #22
AndroidArtifacts.ArtifactType is internal and not part of our public API. The thing to pass to toAppendTo would have to be a MultipleArtifact, but we don't expose many of them yet, and none that are useful for your use case.
At some point we may expose the intermediate artifact that is the final folder of all the .so files, but that may not be what you want either. If it's the final folder, then it's a single folder, so you cannot append to it; you can only transform it (which means taking the content, processing it, and writing the output to a different folder). This is not efficient when you want to add new files to it.
So, your use case is actually better served by using source sets rather than injecting into an intermediate. We recently introduced
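If the truncated reference above is to the variant sources API (an assumption worth verifying against the AGP version in use), a minimal sketch of that route looks like the following. ExternalSoCompileTask and its outputSoFolder stand in for the plugin's own task from comment #21:
// androidComponents = project.extensions.getByType(AndroidComponentsExtension::class.java)
// Plugin task that compiles the external sources into .so files, laid out as
// outputSoFolder/<abi>/libname.so so they can be picked up per ABI.
abstract class ExternalSoCompileTask : DefaultTask() {
    @get:OutputDirectory
    abstract val outputSoFolder: DirectoryProperty

    @TaskAction
    fun compile() {
        // invoke the external toolchain here
    }
}

androidComponents.onVariants { variant ->
    val compileSo = project.tasks.register(
        "compile${variant.name.replaceFirstChar { it.uppercase() }}ExternalSo",
        ExternalSoCompileTask::class.java
    )
    // Registers the generated folder as a jniLibs source, so the .so files are
    // merged into the APK/AAR like any other jniLibs directory.
    variant.sources.jniLibs?.addGeneratedSourceDirectory(
        compileSo,
        ExternalSoCompileTask::outputSoFolder
    )
}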
mc...@ebay.com <mc...@ebay.com> #23
I just ran into a need to modify Android test manifests in my convention plugin. I intended to use the artifacts API to do this, but it looks like there is no SingleArtifact.* making this available. As per the AGP 7.2 docs on MERGED_MANIFEST: "For each module, unit test and android test variants will not have a manifest file available." Can we please get these added as well?
dr...@gmail.com <dr...@gmail.com> #25
We are currently using the old AndroidSourceSet APIs to configure Checkstyle and Detekt for Android projects, as described in this issue:
As far as I can tell, this is not yet covered by the new APIs, and this would likely apply to other static analysis tools that need to process source files as well.
dr...@gmail.com <dr...@gmail.com> #26
I created a new ticket talking about this AndroidSourceSet use case here:
he...@amazon.com <he...@amazon.com> #27
My project currently uses the javaCompileProvider and preBuildProvider APIs of the BaseVariant class.
For the javaCompileProvider case, we use this mainly in application projects. We have some custom code generation tasks that require the classpath of the application to be available (so we use the output of that task), and that we want run "at the same time", so that tasks which depend on JavaCompile have the code we generate ready as well. Example usage is:
android.applicationVariants.configureEach { variant ->
    val customTaskOne = project.tasks.register("customTaskOne${variant.name.capitalized()}") {
        dependsOn(variant.javaCompileProvider, kotlinCompileTask)
    }
    val javaCompileOutput = variant.javaCompileProvider.get().destinationDirectory.get().asFile
    val codeGenerationAction = CodeGenAction(javaCompileOutput, variant)
    variant.javaCompileProvider.configure {
        finalizedBy(customTaskOne)
        doLast(codeGenerationAction)
    }
}
I understand this is a bit janky, but I'm looking to update the code and make sure we're doing things "The Right Way(tm)" going forward. If there's a similar way to accomplish what we're looking for, I'd love to know about it, especially if it's a tool or technique that I'm not familiar with.
For the preBuildProvider use case, we're essentially generating some code early in the process that just needs to be ready. I can likely use some other method of having this task run (as it doesn't really depend on anything else other than being done before the APK is packaged). What would be the suggested way, with the new gradle-api classes, to perform this sort of work?
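One possibility, sketched under the assumption that the generated code only needs to exist before compilation, is the variant sources API. EarlyCodeGenTask and outputDir are illustrative names, not from the original comment:
// androidComponents = project.extensions.getByType(AndroidComponentsExtension::class.java)
abstract class EarlyCodeGenTask : DefaultTask() {
    @get:OutputDirectory
    abstract val outputDir: DirectoryProperty

    @TaskAction
    fun generate() {
        // write the generated .java/.kt sources under outputDir
    }
}

androidComponents.onVariants { variant ->
    val codeGen = project.tasks.register(
        "generate${variant.name.replaceFirstChar { it.uppercase() }}EarlyCode",
        EarlyCodeGenTask::class.java
    )
    // AGP adds the directory to the variant's compilation inputs and wires the task
    // dependency, so the generated code is ready before anything that compiles sources runs.
    variant.sources.java?.addGeneratedSourceDirectory(
        codeGen,
        EarlyCodeGenTask::outputDir
    )
}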
EDIT: I've spent the last few days looking through the new APIs. We scan the final classes and then either generate a .class file for inclusion in the dex or generate a JSON file for inclusion in the application's assets.
For reference, all of this is with AGP 7.4.
I tried using the same task for both to see if AGP would know how to handle that:
variant.artifacts
    .forScope(ScopedArtifacts.Scope.ALL)
    .use(scannerTask)
    .toAppend(
        ScopedArtifact.CLASSES,
        ScannerTask::output
    )

variant.artifacts.forScope(ScopedArtifacts.Scope.ALL)
    .use(scannerTask)
    .toGet(
        ScopedArtifact.CLASSES,
        ScannerTask::allJars,
        ScannerTask::allDirectories
    )
but that led to things just not executing. I didn't see anything with the task name in the --debug output. So, I went ahead and tried using two tasks: one for toGet that would scan all the input and generate the .class file, and one that would then take the output of that scan task and add it using toAppend. Attempting to do this led to a circular dependency, leading me to believe that toGet is ALWAYS executed last.
So, I went ahead and tried using toTransform:
variant.artifacts.forScope(ScopedArtifacts.Scope.ALL)
    .use(scannerTask)
    .toTransform(
        ScopedArtifact.CLASSES,
        ScannerTask::allJars,
        ScannerTask::allDirectories,
        ScannerTask::output
    )
And that worked! The class was generated and included in the dex file. The problem was that the API was expecting me to essentially touch every input file and then add them to the output. That sounds like it's going to kill my build times.
Am I on the right track here and maybe just missing an API to use? Or is this use case not supported by the current APIs?
je...@google.com <je...@google.com> #28
You are correct, toTransform is the only API you can use in your case, because you are trying to get the final version of the artifact in your scannerTask while also trying to append to it (from the same task). Even if you used 2 tasks, you would end up with a circular dependency.
You are also correct that this is not going to be great for your build time.
One of the ways I can think of would be to make a new version of toTransform that would be a lot smarter and allow you to tag unchanged jars/directories. I think that would solve your case completely?
But in the meantime, maybe using KSP or a plain old annotation processor might be another solution; I'm not exactly sure about your constraints.
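To make the KSP suggestion concrete, a minimal sketch of a symbol processor that records annotated classes and generates a registry source file. The @Scanned annotation, package names, and output shape are all hypothetical, and note that KSP only sees symbols resolvable from the processed module, not the final merged CLASSES artifact:
import com.google.devtools.ksp.processing.CodeGenerator
import com.google.devtools.ksp.processing.Dependencies
import com.google.devtools.ksp.processing.Resolver
import com.google.devtools.ksp.processing.SymbolProcessor
import com.google.devtools.ksp.processing.SymbolProcessorEnvironment
import com.google.devtools.ksp.processing.SymbolProcessorProvider
import com.google.devtools.ksp.symbol.KSAnnotated
import com.google.devtools.ksp.symbol.KSClassDeclaration

class ScannerProcessor(private val codeGenerator: CodeGenerator) : SymbolProcessor {
    private var generated = false

    override fun process(resolver: Resolver): List<KSAnnotated> {
        if (generated) return emptyList() // generate the registry only once across rounds
        // Collect every class annotated with the hypothetical @Scanned annotation.
        val names = resolver.getSymbolsWithAnnotation("com.example.Scanned")
            .filterIsInstance<KSClassDeclaration>()
            .mapNotNull { it.qualifiedName?.asString() }
            .toList()
        if (names.isNotEmpty()) {
            generated = true
            codeGenerator.createNewFile(Dependencies.ALL_FILES, "com.example.generated", "ScannedRegistry")
                .bufferedWriter().use { writer ->
                    writer.appendLine("package com.example.generated")
                    writer.appendLine()
                    writer.appendLine("object ScannedRegistry {")
                    writer.appendLine("    val classes = listOf(")
                    names.forEach { writer.appendLine("        \"$it\",") }
                    writer.appendLine("    )")
                    writer.appendLine("}")
                }
        }
        return emptyList()
    }
}

// Registered via META-INF/services/com.google.devtools.ksp.processing.SymbolProcessorProvider
class ScannerProcessorProvider : SymbolProcessorProvider {
    override fun create(environment: SymbolProcessorEnvironment): SymbolProcessor =
        ScannerProcessor(environment.codeGenerator)
}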
he...@amazon.com <he...@amazon.com> #29
I'm sure I could get that to work, having a sort of incremental toTransform, though I don't think that would be the ideal solution for my particular use case. My attempt to use toTransform was based on it allowing me to essentially take a look at every class that's going to end up in the package. I'm not looking to actually modify any of the classes that I scan. With the API you suggested, we would essentially be marking everything we look at as "unchanged", and then attempting to add a new class/asset based on what we saw, or perhaps modifying one or two classes/assets by adding the results of our scans.
My end goal is to essentially "append" to the output using everything previously built as an input. So, maybe not "append", but almost "finalize". That's why we currently use the finalizedBy and doLast APIs for the JavaCompile task. I think my big concern would be using the "transform" API in a way that isn't strictly "transforming". If that's not a concern that you share, then this certainly would be worth trying out.
But in the meantime, maybe using KSP or a plain old annotation processor might be another solution; I'm not exactly sure about your constraints.
That's certainly one of the avenues I'm investigating. I'm just trying to make sure I've taken a look at and understand all of the options that are available.
xa...@google.com <xa...@google.com> #30
The problem with a finalize API is that only one thing can do it. If we expose this as a proper API, then we have to make sure only one plugin can do it and fail if two plugins try to do it. If we start having several published plugins using that API, they will not be compatible with each other.
This is really not a path we want to go down at the moment.
So "transforming" but not actually touching the files is perfectly fine (as long as you do copy them into the output), though you have to realize that the API cannot guarantee that you are last. You have to manage your plugin application order and hope your transform is added last.
he...@amazon.com <he...@amazon.com> #31
The problem with a finalize API is that only one thing can do it. If we expose this as a proper API, then we have to make sure only one plugin can do it and fail if two plugins try to do it. If we start having several published plugins using that API, they will not be compatible with each other.
Oh yeah, I absolutely understand the turmoil adding an API like that can cause, especially down the line. I don't blame you at all for not wanting to codify that potential nightmare in the public API. If the "transform" API is built and maintained in such a way that it accounts for folks not always actually wanting to transform classes, I think that would be sufficient.
you have to realize that the API cannot guarantee that you are last.
That's okay. No external dependencies should be using these, as they're strictly internal. We also don't need to worry about other internal plugins using/generating classes that we would be required to scan. We don't need to be "last" as much as we need to be "after compilation but before packaging", which this API would provide us.
je...@google.com <je...@google.com> #32
I have been thinking about this a bit more and it's actually not easy to provide an API where you can identify some untouched inputs as outputs.
The main reason is that Gradle will complain if two tasks output the same file/directory, so one way or another, we must copy the inputs into the outputs, which is probably what you already do. At least we would save the merging step, which is an improvement, but there would still be a fair amount of I/O.
he...@amazon.com <he...@amazon.com> #33
Would it be possible to have a "Read Only" API that allows scanning/reading of the non-generated code of the project, which would be followed by tasks that perform this code generation/modification? Something like:
variant.artifacts
    .forScope(ScopedArtifacts.Scope.ALL)
    .scan(scannerTask)
    .andOutput(writeTask)
    .with(
        ScopedArtifact.CLASSES,
        ScannerTask::allJars,
        ScannerTask::allDirectories,
        WriteTask::output
    )
So, my scannerTask would be responsible for running over the classes and building up the manifest that it wants to generate. Then, the writeTask would take that manifest and generate code into the output directory. This way you have a very clear separation between processing and output tasks. On my side, I could keep the manifest in memory, since I expect the tasks to run as a pair (as in, writeTask wouldn't run if scannerTask hasn't). If it's better practice, scannerTask can be used as an input for writeTask.
I can see the scan API being useful for any sort of processing on the APK that needs to be done, including any sort of reporting folks might want. It can allow classes to be scanned without necessarily modifying the output. However, if andOutput were included, the writeTask would be added alongside the other code-modifying tasks. I think the order here can make sense if the scannerTask were always to operate on the non-generated, non-modified classes.
I'm not sure if that all makes sense; I'm spitballing.
je...@google.com <je...@google.com> #34
You are still introducing a circular reference, as the scanner task wants to have access to all the final CLASSES and generate a manifest that the writer task would use to generate a new element of the CLASSES artifact. The fundamental issues here are:
- providing an API that allows transforming while mostly leaving the original items unchanged.
- being independent of the plugin apply order.
Mostly something like:
variant.artifacts.forScope(ScopedArtifacts.Scope.ALL)
    .use(scannerTask)
    .toGetAndAdd(
        ScopedArtifact.CLASSES,
        ScannerTask::allJars,
        ScannerTask::allDirectories,
        ScannerTask::output
    )
but that means you are not guaranteed to have the final version of CLASSES as some other Plugin may add a folder after you...
he...@amazon.com <he...@amazon.com> #35
That's why I was trying to phrase it as "non-generated", but I'm not sure how useful that would be outside of my specific use case (and obviously nobody wants to support an API for some weird one-off). For our project, whatever work other plugins might do after we perform ours doesn't matter, as I can guarantee for my project, in this instance, that things will behave as I expect.
If this is something you think might be worthwhile to add, with the above caveats, that would be swell. We'll likely use it in some form. If it's not something that seems like it would be worth supporting, which is not unreasonable, I'm sure I'll figure out some other solution before the time comes to migrate. Necessity is the mother of invention, after all.
I appreciate the discussion and consideration.
da...@gmail.com <da...@gmail.com> #36
Hi,
I wrote a plugin that generates a wrapper around string resources, so that I have a mockable class while still being able to use string resources in viewModels etc.
The way I did this before was to add a dependency on the process<...>Resources task, find the R.jar file, unzip it, and use a class visitor to extract the string/plural names. Then I used codegen to create the wrapper with this data. Then I had to add my task as a dependency of compile<...>Kotlin.
It's taken me a while to get to grips with the new system, just trying to slot my task in to run at the correct time, but I've ended up with something like the following:
project.plugins.withType(AppPlugin::class.java) {
    val androidComponents = project.extensions.getByType(AndroidComponentsExtension::class.java)
    androidComponents.onVariants { variant ->
        variant.sources.java?.let { sources ->
            val generateStringsTask = project.tasks.register(
                "generate${variant.name}Strings",
                GenerateVariantStringsTask::class.java,
            ) {
                it.variantPackageName.set(variant.namespace)
                it.outputDir.set(project.layout.buildDirectory.dir("generated/source/stringrepository/${variant.name}"))
            }
            variant.artifacts
                .use(generateStringsTask)
                .wiredWith(GenerateVariantStringsTask::symbolsFile)
                .toListenTo(SingleArtifact.RUNTIME_SYMBOL_LIST)
            sources.addGeneratedSourceDirectory(
                generateStringsTask,
                GenerateVariantStringsTask::outputDir
            )
        }
    }
}
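For completeness, a sketch of what the GenerateVariantStringsTask referenced above could look like, assuming the runtime symbol list keeps its plain-text "int <type> <name> <value>" per-line format (an assumption worth verifying) and generating only a commented skeleton rather than the real wrapper:
abstract class GenerateVariantStringsTask : DefaultTask() {

    @get:Input
    abstract val variantPackageName: Property<String>

    // Wired via wiredWith(...).toListenTo(SingleArtifact.RUNTIME_SYMBOL_LIST) above.
    @get:InputFile
    abstract val symbolsFile: RegularFileProperty

    @get:OutputDirectory
    abstract val outputDir: DirectoryProperty

    @TaskAction
    fun generate() {
        // Each line is expected to look like "int string app_name 0x7f13001c".
        val names = symbolsFile.get().asFile.readLines()
            .map { it.trim().split(' ') }
            .filter { it.size >= 4 && (it[1] == "string" || it[1] == "plurals") }
            .map { it[1] to it[2] }
        val wrapper = buildString {
            appendLine("package ${variantPackageName.get()}")
            appendLine()
            appendLine("object StringRepository {")
            names.forEach { (type, name) -> appendLine("    // TODO generate accessor for $type/$name") }
            appendLine("}")
        }
        outputDir.get().asFile.resolve("StringRepository.kt").writeText(wrapper)
    }
}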
I'll admit, this new way is a hell of a lot easier for getting the data I need. The problem is that the amount of exposed artifacts is pretty limited (at least compared to what I've seen in InternalArtifactType).
RUNTIME_SYMBOL_LIST includes transitive dependencies, breaking my generated code, as the module's R class can no longer 'see' those transitive resources (with the default AGP 8+ non-transitive R class setting).
Ideally, having a non-transitive version of that artifact (pointing to the local_only_symbol_list file?) would be perfect, but considering my confusion in picking up these new APIs, there's a good chance I'm wrong in my approach for this.
Any guidance would be very much appreciated.
Thanks
je...@google.com <je...@google.com> #37
#36, I filed
mu...@gmail.com <mu...@gmail.com> #38
Thank you for the great work you do; please keep it up.
Description
As part of the Android Gradle plugin team's plan to help developers more easily upgrade to newer versions of the Android Gradle plugin, we need to migrate plugins and build scripts off using internal implementation details of the plugin. See the Gradle plugin roadmap for more details about our planned timeline.
This issue aims to capture use cases that are not currently supported by the APIs of the Android Gradle Plugin.
If you have a use case for extending your Android app or library build that is not covered by the APIs available in the com.android.tools.build:gradle-api Maven artifact, such as directly reading or modifying the tasks registered by the Android Gradle plugin, please comment here, explaining what you're trying to achieve.
For example, explaining "I have a custom static analysis tool to run locally and on CI that needs all the shrinking configuration and the final APK as an input, and at the moment I'm maintaining a custom task that consumes these intermediate files as inputs" is more helpful than just stating the intermediate files without the bigger picture about what you want to use them for.
Even if your use case is similar, but not identical, to one already posted, please include it here, to make sure we're aware of your specific use case.
We're planning to remove the old APIs in Android Gradle plugin 9.0 (Mid 2023), and we want to minimise the disruption caused by that as much as we can.
Thanks for your help in improving the build experience for all developers!