Status Update
Comments
[Deleted User] <[Deleted User]> #4
Android Studio Arctic Fox | 2020.3.1 Canary 13
Build #AI-203.7148.57.2031.7242491, built on March 29, 2021
Runtime version: 11.0.8+0-b944-P17168821 amd64
ga...@google.com <ga...@google.com> #5
See the attachment for how to reproduce it on Canary 13 as well:
Android Studio Arctic Fox | 2020.3.1 Canary 13
Build #AI-203.7148.57.2031.7242491, built on March 29, 2021
Runtime version: 11.0.8+10-b944.6842174 amd64
VM: OpenJDK 64-Bit Server VM by N/A
Windows 10 10.0
GC: G1 Young Generation, G1 Old Generation
Memory: 4000M
Cores: 12
Registry: external.system.auto.import.disabled=true, ide.settings.move.mouse.on.default.button=true, debugger.watches.in.variables=false
Non-Bundled Plugins: Show As ..., String Manipulation, com.dubreuia, com.intellij.marketplace, org.jetbrains.kotlin, com.google.mad-scorecard, org.intellij.plugins.markdown
r....@gmail.com <r....@gmail.com> #6
Is it still reproducible with the new Logcat in Studio Electric Eel?
sp...@google.com <sp...@google.com> #7
I don't remember what was written in the project a year ago...
Please restore the files so that I can check it out.
Also, please stop deleting my files.
r....@gmail.com <r....@gmail.com> #8
The files were deleted due to expiration of the retention period for restricted attachments. They are not recoverable.
ga...@google.com <ga...@google.com> #9
Otherwise there is no point in writing here, as you answer too late for me to check it again; you don't give me a chance to do so.
sp...@google.com <sp...@google.com> #10
Sorry it took us so long to address this bug. That said, the Logcat tool window was rewritten from scratch so this bug is no longer relevant.
[Deleted User] <[Deleted User]> #11
Can you please fix the way it shows the text, though?
I mean this:
Many times I forget it even exists, and I see the time instead of the text that I printed...
sp...@google.com <sp...@google.com> #12
Thanks! Looks like the issue is FileUtils.isFileInDirectory. I have a WIP change from a couple of months ago (ChangeId Ia88494c0b353545bf1ea3035ef8f165c1f5923e4) to improve the performance of this method, but I never merged it :/
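For context, a per-file containment check like this becomes a hotspot if it canonicalizes paths (hitting the filesystem) on every call. Below is a minimal, purely lexical sketch written from scratch for illustration — it is not AGP's actual FileUtils code, and unlike canonicalization it does not resolve symlinks:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class PathCheck {
    // Lexical containment check: normalize both paths in memory
    // (no filesystem I/O) and compare prefixes.
    static boolean isFileInDirectory(Path file, Path dir) {
        return file.toAbsolutePath().normalize()
                   .startsWith(dir.toAbsolutePath().normalize());
    }

    public static void main(String[] args) {
        System.out.println(isFileInDirectory(
                Paths.get("/libs/sub/../sub/libfoo.so"), Paths.get("/libs"))); // true
        System.out.println(isFileInDirectory(
                Paths.get("/other/libfoo.so"), Paths.get("/libs")));           // false
    }
}
```

The trade-off is correctness under symlinks versus speed: a lexical check is O(path length) with no I/O, which matters when it runs once per file in a large native-libs merge.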
[Deleted User] <[Deleted User]> #13
Is there any chance this could be ported to a 4.1 point release or 4.2? I would be happy to test a branch/build of AGP for this (through Ivan?).
sp...@google.com <sp...@google.com> #15
I was able to repro this issue by creating an app project with dependencies on 500 different AARs with .so files.
When I shuffled the order of the AAR dependencies in the build.gradle file to simulate the nondeterministic ordering of the externalLibNativeLibs task input, the mergeDebugNativeLibs task took about 50 seconds.
One possible workaround for this issue would be to use @InputFiles instead of @Classpath in MergeNativeLibsTask.
Re #9, is there a bug filed for the 2nd issue, Ivan?
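The suggestion above matters because the two annotations imply different input fingerprinting: a classpath-style fingerprint is order-sensitive (a shuffled but otherwise identical set of AARs looks like a changed input), while plain input-file fingerprinting keyed by path is not. A toy Java model of that difference — not Gradle's actual fingerprinting code, all names here are made up:

```java
import java.util.*;

public class FingerprintDemo {
    // Toy model: an input is a (path, contentHash) pair.
    record Input(String path, int contentHash) {}

    // Order-sensitive fingerprint, roughly how a classpath is treated:
    // the sequence of entries matters.
    static int orderSensitive(List<Input> inputs) {
        return inputs.hashCode();
    }

    // Order-insensitive fingerprint, roughly how plain input files are
    // treated: entries are considered as a set keyed by path.
    static int orderInsensitive(List<Input> inputs) {
        List<Input> sorted = new ArrayList<>(inputs);
        sorted.sort(Comparator.comparing(Input::path));
        return sorted.hashCode();
    }

    public static void main(String[] args) {
        List<Input> a = List.of(new Input("lib1.aar", 1), new Input("lib2.aar", 2));
        // Same files, shuffled order:
        List<Input> b = List.of(new Input("lib2.aar", 2), new Input("lib1.aar", 1));
        System.out.println(orderSensitive(a) == orderSensitive(b));     // false: looks "changed"
        System.out.println(orderInsensitive(a) == orderInsensitive(b)); // true: up to date
    }
}
```

With an order-sensitive fingerprint, the nondeterministic ordering described above makes the task randomly appear out of date, which matches the random up-to-date status in the original report.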
[Deleted User] <[Deleted User]> #16
Note that a bug report has been opened on the Gradle repo as well, since the cache-key computation issue seems more general than the mergeDebugNativeLibs task:
To summarize: mergeDebugNativeLibs seems to have a performance issue in incremental mode, and that is the focus of this report, while the Gradle ticket focuses on the reproducibility of the cache-key computation for a few classes whose inputs arrive in shuffled order.
ga...@google.com <ga...@google.com> #17
Scott, can we try to optimize the incremental scenario with many changes, and if the cost is still too high, fall back to a clean run when the number of changed files is greater than some threshold? WRT the other issue (classpath ordering), I've just filed a separate bug for it.
sp...@google.com <sp...@google.com> #18
Discussed offline. Current plan is the following:
- In 4.2, run the task non-incrementally if the number of changed inputs is above a certain threshold
- In 7.0, fix the performance issue for incremental runs
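The 4.2 plan above amounts to a simple guard: merge only the changed files when the change set is small, otherwise redo the whole merge. A schematic sketch of that decision — not the actual MergeNativeLibsTask code, and the threshold value here is invented:

```java
import java.util.Collections;
import java.util.List;

public class IncrementalGuard {
    // Hypothetical threshold; the value AGP actually uses may differ.
    static final int THRESHOLD = 100;

    // Incremental merging is cheap for small change sets, but per-file
    // bookkeeping dominates when almost every input changed, so fall
    // back to a full run beyond the threshold.
    static boolean runIncrementally(List<String> changedFiles) {
        return changedFiles.size() <= THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(runIncrementally(List.of("libfoo.so")));              // true
        System.out.println(runIncrementally(Collections.nCopies(500, "lib.so"))); // false
    }
}
```

This avoids the pathological incremental case (e.g. the shuffled 500-AAR repro above) without waiting for the full fix that landed in 7.0.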
sp...@google.com <sp...@google.com> #19
4.2 beta 4 has the fix (workaround) for this. It's slated for release on Monday, Jan 11.
[Deleted User] <[Deleted User]> #21
Thank you very much for the update @sp, we're eager to try it out.
[Deleted User] <[Deleted User]> #22
Confirmed, it seems to be fixed in AGP 4.2 beta 4. Thank you very much for all your work!
ha...@gmail.com <ha...@gmail.com> #23
```
android {
    // your existing code
packagingOptions {
pickFirst '**/libc++_shared.so'
pickFirst '**/libfbjni.so'
}
}
```
Description
Studio Build:
Version of Gradle Plugin: 4.1
Version of Gradle: 6.7
Version of Java: JDK 11.0+
Version of Kotlin Gradle Plugin: 1.4.20
OS: Mac 10.15.7
Steps to Reproduce: launch a build from the command line. The problem is intermittent: it randomly impacts our developers and our benchmarks (using gradle-profiler, 6 measured builds, 50% are faulty). The build does not contain native libraries.
We can see that the mergeDebugNativeLibs task has a random up-to-date status, and when it runs, it takes around 160 seconds. This has a large impact on our builds. When we compare builds with the faulty mergeDebugNativeLibs vs builds that go well, we can see (via Gradle Enterprise build scans) that the main difference is the up-to-date status of this task. Attached are a few snapshots of the Gradle Enterprise scans.
To summarize:
Note that the way we run our benchmarks is 100% reproducible: ./gradlew clean <target> --no-build-cache, yet the up-to-date status of this task is random.
Note 2: when running with --rerun-tasks, we can see that the task always runs fast.
Note 3: we are aware that Gradle Enterprise has some issues with worker-based tasks and can show a task duration much larger than it actually is, because it waits for dependent tasks; but due to the inconsistency of the repro, we could not get a more accurate diagnostic (clean task, re-run task). And we do see that the task duration adds to the total build duration, so we don't think this is just a display issue.