Status Update
Comments
oy...@gmail.com <oy...@gmail.com> #2
Project: platform/frameworks/support
Branch: androidx-main
Author: Kuan-Ying Chou
Remove ToT dependencies in wear:compose:compose-navigation
Bug: 374102924
Test: n/a
Change-Id: Id057c7254a0282eab77a2facb4d278e3b5ccf784
Files:
- M wear/compose/compose-navigation/samples/build.gradle
Hash: 285f596b323a37b82e07621acb990847b8cf4328
Date: Thu Oct 31 13:36:53 2024
mi...@google.com <mi...@google.com> #3
Project: platform/frameworks/support
Branch: androidx-main
Author: Kuan-Ying Chou
Update Kotlin targets to 1.9
Update Kotlin target of savedstate and projects that depend on it to 1.9.
Bug: 374102924
Test: existing tests still pass
Change-Id: I442310ec57f7db381c16f95985a952deb9af57b2
Files:
- M lifecycle/lifecycle-viewmodel-compose/build.gradle
- M lifecycle/lifecycle-viewmodel-compose/samples/build.gradle
- M lifecycle/lifecycle-viewmodel-savedstate/build.gradle
- M lifecycle/lifecycle-viewmodel-testing/build.gradle
- M savedstate/savedstate-ktx/build.gradle
- M savedstate/savedstate/build.gradle
- M savedstate/savedstate/src/nativeMain/kotlin/androidx/savedstate/internal/SynchronizedObject.native.kt
Hash: 73305d3cf9d9bfff42b1836ac0127032b7721496
Date: Tue Oct 29 17:27:51 2024
oy...@gmail.com <oy...@gmail.com> #4
Project: platform/frameworks/support
Branch: androidx-main
Author: Kuan-Ying Chou
Add functions to encode serializable objects to and decode serializable objects from `SavedState`.
We do this by implementing an `Encoder` and a `Decoder` that write and read a `SavedState` output format for Kotlin Serialization (https://github.com/Kotlin/kotlinx.serialization). The format represents a Kotlin Serialization "primitive" as its corresponding `SavedState`-supported type, and represents a "composite" as a `SavedState` with its property names as keys. Nested composites are represented by nested `SavedState`s.
Here are some other design choices we made:
1. We don't record size for collections and rely on the size of the `SavedState` in decoding so that collections and non-collections are treated the same.
2. To save space, we don't encode default values by default (this can be tweaked with `@EncodeDefault` when using plugin-generated serializers).
3. To support nullable parameters with default arguments (e.g. `val a: String? = "foo"`) and for `SavedState` size to be correct for collections, we encode `null`s.
4. To keep things simple we don't support specifying custom `SerializersModule`s for now.
Please note that when using Kotlin Serialization and these functions as a `Parcelable` alternative on Android, there's a performance overhead because of the extra serialization needed.
Also note that on Android we don't have built-in support for Android or Java specific types supported by `Bundle` or `Parcel` yet (e.g. `Exception`, `java.io.Serializable`, or `IBinder`). We may consider adding these types in the future.
Relnote: Add encodeToSavedState() and decodeFromSavedState() functions
Test: SavedStateCodecTest.kt
Bug: 374102924
Change-Id: I6f59faffaa3777bf56132a67f41b867d7a9663e5
Files:
- M development/build_log_simplifier/messages.ignore
- M docs-tip-of-tree/build.gradle
- A savedstate/savedstate-samples/build.gradle
- A savedstate/savedstate-samples/src/main/java/androidx/savedstate/SavedStateCodecSamples.kt
- M savedstate/savedstate/api/current.txt
- M savedstate/savedstate/api/restricted_current.txt
- M savedstate/savedstate/bcv/native/current.txt
- M savedstate/savedstate/build.gradle
- A savedstate/savedstate/src/androidUnitTest/kotlin/androidx/savedstate/SavedStateCodecAndroidTest.android.kt
- A savedstate/savedstate/src/commonMain/kotlin/androidx/savedstate/serialization/SavedStateDecoder.kt
- A savedstate/savedstate/src/commonMain/kotlin/androidx/savedstate/serialization/SavedStateEncoder.kt
- A savedstate/savedstate/src/commonTest/kotlin/androidx/savedstate/SavedStateCodecTest.kt
- A savedstate/savedstate/src/commonTest/kotlin/androidx/savedstate/SavedStateCodecTestUtils.kt
- M settings.gradle
Hash: 57162ea74b646ef7ae43397251fd4cd32e2d80c3
Date: Thu Oct 17 19:10:05 2024
mi...@google.com <mi...@google.com> #5
oy...@gmail.com <oy...@gmail.com> #6
The prediction was able to happen successfully in the warm-up request, but when I try a real predict request via the predict endpoint, it fails.
It looks like those files are only available within the from_path context.
import json
import os
import pickle
import sys

import output  # module containing the actual prediction code

# Class wrapper and __init__ reconstructed from the cls(model_dir, model)
# call in from_path below.
class MyPredictor:
    def __init__(self, model_dir, model):
        self.model_dir = model_dir
        self.model = model

    def predict(self, instances, **kwargs):
        # assume I get the payload here
        # clean_str = urllib.parse.quote_plus("")
        sys.stderr.write(json.dumps({'instance': instances}))
        sys.stderr.write("\n")
        sys.stderr.write(str(type(instances)))
        """
        filelist = []
        for root, dirs, files in os.walk("/tmp"):
            for file in files:
                # append the file name to the list
                filelist.append(os.path.join(root, file))
        # print all the file names
        for name in filelist:
            sys.stderr.write(name)
        """
        return output.output(clean_str="", req_path="", dir_path=self.model_dir)

    @classmethod
    def from_path(cls, model_dir):
        a = os.path.join(model_dir, '4/barkbox/v21_cparam2_idfTrue.sav')
        with open(a, 'rb') as f:
            model = pickle.load(f)
        predictor = cls(model_dir, model)
        outputs = predictor.predict([[1, 2, 3, 4, 5]])  # warm-up prediction request
        return predictor
The warm-up prediction request seems to work, and I believe it is still calling my predict method. Why doesn't it work when I use the predict endpoint?
[image: Screenshot 2020-08-26 at 12.21.31 PM.png]
The predict endpoint doesn't work; it says it can't find the file on the server. What exactly am I missing?
Do I need to pass the loaded model down into my files, or does the main predict code have to be in the predict method exactly? Notice that the main predict code is in another output file. But why does the warm-up predict work in the from_path method, which I believe still uses the same predict code from the output module? And why doesn't it work from the normal predict endpoint?
[image: Screenshot 2020-08-26 at 12.21.42 PM.png]
Thanks for your help so far.
mi...@google.com <mi...@google.com> #7
Hi,
I ran some additional tests and confirmed that the files located in model_dir are only available during the execution of the from_path function, i.e., when the model gets loaded.
Although this behavior is not completely described in the documentation, it is implied there:
AI Platform Prediction prediction nodes use the from_path class method to load an instance of your Predictor. This method should load the artifacts you saved in your model directory, the contents of which are copied from Cloud Storage to a location surfaced by the model_dir argument.
With this in mind, let me address your concerns:
- The predict endpoint doesn't work, It says it can't find the file on the server, please what exactly am I missing? Is it that I need to pass the loaded model down into my files [...]
Yes, the model and its artifacts should be loaded in the from_path method and stored in your prediction class. See the example from the documentation.
- [...] the main predict code has to be in the predict method exactly. Notice that the main predict code is in another output file.
No, the code that makes the prediction can be located in another file that gets imported.
- But why does the warm-up predict work in the from_path method which I believe still uses the same predict code from the output module? And why doesn't it work from the normal predict endpoint.
This is explained at the beginning of this comment. The files in model_dir are only available during the execution of from_path. That's why, if you access those files during a normal prediction request, they are not found.
I hope that the information has been helpful. Please try loading the model and the necessary artifacts in the from_path method, along the lines of the sketch below, and let me know if after doing so you are still running into this issue.
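A minimal sketch of that structure (the model file name and the output signature here are illustrative assumptions based on the code you posted, not a documented API):

import os
import pickle

import output  # hypothetical module holding your actual prediction code

class MyPredictor:
    def __init__(self, model):
        # Keep every loaded artifact on the instance; model_dir itself
        # will not be readable at predict time.
        self._model = model

    def predict(self, instances, **kwargs):
        # Use only in-memory state here; never re-open files from model_dir.
        return output.output(model=self._model, instances=instances)

    @classmethod
    def from_path(cls, model_dir):
        # model_dir is only guaranteed to exist while this method runs,
        # so load everything you need right now.
        with open(os.path.join(model_dir, 'model.sav'), 'rb') as f:
            model = pickle.load(f)
        return cls(model)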
Kind Regards.
oy...@gmail.com <oy...@gmail.com> #8
If you notice, my predict method doesn't actually use the model I loaded in the from_path method. It is still the same method that is called both in the warm-up request and the main predict request.
I am currently loading the model in the from_path method based on the mail I sent to you, but my predict method is not using the model that was loaded. It still worked for the warm-up request, though.
Are you saying that I should pass down that model to the code that does the actual prediction?
I am sorry for asking too many questions; it is just that I am quite confused.
The warm-up request doesn't use the model I loaded in from_path; the actual predict code reloads the models it needs.
I am just confused as to why it worked in that context but doesn't work in a different context.
mi...@google.com <mi...@google.com> #9
Hi,
I understand the confusion. The warm-up request works because, even though the model is not loaded explicitly in the from_path function, the warm-up prediction is executed in the same context: from_path is executed first to load the model, and at that point the files are available through model_dir. At a later time, in a prediction, from_path is not called, only predict, and the model_dir files are no longer available.
Tracing back the calls, it would be something like the following:
- When the model is loaded, model_dir is available:
  from_path is called
  predict is called within from_path
  output is called within predict  # At this point, model_dir files are available
- In a normal predict request, model_dir is not available:
  predict is called
  output is called within predict  # This fails because model_dir files are not available
For this reason, the idea is to load all the necessary artifacts in the from_path call and store them as variables inside the predictor class, as shown in the documentation and in the sketch below.
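To make the same point in code, a minimal sketch of the output module receiving the already-loaded model instead of reloading it (the parameter list here is an illustrative assumption, not your actual signature):

# output.py -- sketch only; adapt the parameters to your real code
def output(model, instances, clean_str="", req_path=""):
    # Work only with the model object handed in by the predictor;
    # do not open files under model_dir here, since that directory
    # no longer exists by the time a normal predict request arrives.
    return model.predict(instances)

Your predict method would then call output.output(model=self.model, instances=instances, ...) using the model stored by from_path.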
Hope this helps to clarify that concern.
oy...@gmail.com <oy...@gmail.com> #10
oy...@gmail.com <oy...@gmail.com> #11
I have one more issue I would like you to help me with.
I can't seem to install the Google Cloud Language library on the server from setup.py.
[image: Screenshot 2020-08-27 at 11.59.52 AM.png]
Here is the install_requires in my setup.py:
install_requires=[
    'Unidecode>=1.1.1',
    'urllib3>=1.25.9',
    'pyap>=0.1.0',
    'Keras==2.3.0',
    'pymongo>=3.10.1',
    'datefinder>=0.6.1',
    'lxml',
    'scikit-multilearn==0.0.5',
    'google-api-core==1.16.0',
    'google-auth==1.15.0',
    'google-cloud-language==1.3.0',
    'grpc-google-iam-v1==0.12.3',
    'grpcio==1.29.0'
]
It works well without any of the Google packages. Please, what should I do?
oy...@gmail.com <oy...@gmail.com> #12
I would also like to ask: how can I give my storage bucket file domain access? I want to be able to access those files from the AI Platform server.
What domain should I add?
Looking forward to hearing from you.
mi...@google.com <mi...@google.com> #13
Hi,
I understand that you have an issue with the google-cloud-language library, and a question about your storage bucket. Please consider that Issue Tracker is meant to be used to report defects on the service, and each issue should be tracked separately.
For this reason, to address your concern about the google-cloud-language library, I opened a new issue on your behalf, which you can follow separately.
Regarding the Storage question, please use an appropriate technical support channel for that kind of question.
I appreciate your understanding on this. Since the original issue has now been solved, I'll proceed to close it; please feel free to open a new one in case you have additional questions or concerns regarding the "No such file or directory" issue.
Description
The from_path class method on my predictor class gives the model_dir as '/tmp/model/0001', but this path doesn't exist.
Please, how do I go about this, and what am I missing? I am able to create the model version successfully, but the prediction fails when I make a request to predict.