Verified
Comments
cc...@google.com <cc...@google.com> #2
Yigit, do you have time to fix it?
reemission of the same liveData is racy
tw...@googlemail.com <tw...@googlemail.com> #3
yea i'll take it.
tw...@googlemail.com <tw...@googlemail.com> #4
Thanks for the detailed analysis. This may not be an issue anymore since we've started using Main.immediate there, but I'm not sure; I'll try to create a test case.
cc...@google.com <cc...@google.com> #5
just emitting the same LiveData reproduces the issue:
@Test
fun raceTest() {
    val subLiveData = MutableLiveData(1)
    val subject = liveData(testScope.coroutineContext) {
        emitSource(subLiveData)
        emitSource(subLiveData) // crashes
    }
    subject.addObserver().apply {
        testScope.advanceUntilIdle()
    }
}
tw...@googlemail.com <tw...@googlemail.com> #6
With 2.2.0-alpha04 (which uses Main.immediate), the issue seems to still be there (I tested it by calling emitSource() twice, as in your test case).
tw...@googlemail.com <tw...@googlemail.com> #7
yea sorry immediate does not fix it.
I actually have a WIP fix for it:
https://android-review.googlesource.com/c/platform/frameworks/support/+/1112186
if your case is the one I found (emitting the same LiveData multiple times, as shown in #5), you can work around it by adding a dummy transformation:
val subLiveData = MutableLiveData(1)
val subject = liveData(testScope.coroutineContext) {
    emitSource(subLiveData.map { it })
    emitSource(subLiveData.map { it })
}
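The dummy transformation helps because each map { it } call wraps subLiveData in a new intermediate LiveData (a MediatorLiveData under the hood), so the two emitSource() calls register two distinct source instances and the builder never tries to add the same source again while its previous registration is still being removed.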
cc...@google.com <cc...@google.com> #8
Project: platform/frameworks/support
Branch: androidx-master-dev
commit af12e75e6b4110f48e44ca121466943909de8f06
Author: Yigit Boyar <yboyar@google.com>
Date: Tue Sep 03 12:58:11 2019
Fix coroutine livedata race condition
This CL fixes a bug in liveData builder where emitting same
LiveData source twice would make it crash because the second
emission registry could possibly happen before first one is
removed as source.
We fix it by using a suspending dispose function. It does feel
a bit hacky but we cannot make DisposableHandle.dispose async
and we do not want to block there. This does not mean that there
is a problem if developer disposes it manually since our emit
functions take care of making sure it disposes (and there is
no other way to add source to the underlying MediatorLiveData)
Bug: 140249349
Test: BuildLiveDataTest#raceTest_*
Change-Id: I0b464c242a583da4669af195cf2504e2adc4de40
M lifecycle/lifecycle-livedata-ktx/api/2.2.0-alpha05.txt
M lifecycle/lifecycle-livedata-ktx/api/current.txt
M lifecycle/lifecycle-livedata-ktx/api/public_plus_experimental_2.2.0-alpha05.txt
M lifecycle/lifecycle-livedata-ktx/api/public_plus_experimental_current.txt
M lifecycle/lifecycle-livedata-ktx/api/restricted_2.2.0-alpha05.txt
M lifecycle/lifecycle-livedata-ktx/api/restricted_current.txt
M lifecycle/lifecycle-livedata-ktx/src/main/java/androidx/lifecycle/CoroutineLiveData.kt
M lifecycle/lifecycle-livedata-ktx/src/test/java/androidx/lifecycle/BuildLiveDataTest.kt
https://android-review.googlesource.com/1112186
https://goto.google.com/android-sha1/af12e75e6b4110f48e44ca121466943909de8f06
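For readers following the commit message, here is a rough sketch of the suspending-dispose idea it describes. This is only an illustration under assumptions about how CoroutineLiveData.kt is structured; the class and function names below (EmittedSourceSketch, disposeNow) are made up and are not the actual implementation.
import androidx.lifecycle.LiveData
import androidx.lifecycle.MediatorLiveData
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.DisposableHandle
import kotlinx.coroutines.launch
import kotlinx.coroutines.withContext

// Sketch only: the handle returned to callers keeps the non-suspending
// DisposableHandle contract, while the builder itself awaits a suspending
// variant before adding the next source.
internal class EmittedSourceSketch<T>(
    private val source: LiveData<T>,
    private val mediator: MediatorLiveData<T>
) : DisposableHandle {

    private val mainScope = CoroutineScope(Dispatchers.Main.immediate)

    // dispose() cannot suspend, so the best it can do is post the removal.
    override fun dispose() {
        mainScope.launch { removeSource() }
    }

    // Suspending variant: returns only after the source is actually removed,
    // so a following addSource() for the same LiveData cannot race with it.
    suspend fun disposeNow() = withContext(Dispatchers.Main.immediate) {
        removeSource()
    }

    private fun removeSource() {
        mediator.removeSource(source)
    }
}
In this sketch, the builder's emit/emitSource implementations would call disposeNow() on the previous emission before registering a new source, which lines up with the commit's note that the emit functions take care of disposing.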
cc...@google.com <cc...@google.com> #9
The fix for the incorrect initial fetch position in #5 has been merged and will go out with the next AndroidX Paging release.
cc...@google.com <cc...@google.com> #10
Released with Paging 2.0.0-beta01 (the current revision is 2.0.0 stable).
Description
Version used: all versions
Devices/Android versions reproduced on: all versions
Hi,
thanks for the great work.
I use a slightly modified version of the PagingSample. Instead of just adding one item when tapping on the "Add" button, I add 20 copies of the same item:
fun insert(text: CharSequence) = ioThread {
    // Insert 20 copies of the same item; id = 0 lets Room auto-generate the primary key.
    dao.insert(List(20) { Cheese(id = 0, name = text.toString()) })
}
When adding items that will be prepended to the list, the PositionalDataSource fails to reload the visible items correctly. This is a general problem.
My guess:
The PositionalDataSource doesn't know whether newly inserted items will be prepended or appended by the query. Let's say we have the following scenario (a small standalone sketch of it follows the list):
- initially, items are loaded from position 20
- 50 new items are prepended
- this invalidates the data source and triggers a new initial load from position 20
- there are now completely different items at position 20
- the async diff sees the changes and triggers animations
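To make this concrete, here is a small self-contained sketch (plain Kotlin, no Paging APIs; the numbers are chosen to match the scenario above) showing why a position-based reload after a prepend lands on completely different items:
fun main() {
    val pageSize = 10
    val original = (1..100).map { "cheese $it" }

    // Initial load from position 20.
    val firstLoad = original.drop(20).take(pageSize)

    // 50 new items are prepended, the data source is invalidated,
    // and the new initial load again starts at position 20.
    val afterPrepend = (1..50).map { "new cheese $it" } + original
    val reload = afterPrepend.drop(20).take(pageSize)

    // The two windows share no items, so the async differ sees the visible
    // range as completely changed and runs change animations.
    println(firstLoad)  // [cheese 21, ..., cheese 30]
    println(reload)     // [new cheese 21, ..., new cheese 30]
}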
This also seems to be related to the page size: with a larger page size it takes more inserts before the problem shows up, and with a smaller page size it shows up after just the first insert.
My question is:
How should prepending inserts be handled with the PositionalDataSource?