Assigned
Status Update
Comments
ju...@google.com <ju...@google.com> #2
The product team is aware of this feature request, but there is no ETA for its implementation. Please star this issue to receive further updates. Thanks!
Description
What you would like to accomplish:
- Customer would like to use ML Engine Online Prediction on image data that exceeds the prediction request payload size limit [1], since image data easily exceeds 1.5 MB.
[1]
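To illustrate why image payloads hit the limit so easily: online prediction requests are JSON, so binary image data must be base64-encoded, which inflates it by roughly a third before any JSON overhead. A minimal sketch, assuming the documented `{"instances": [...]}` request shape with a `{"b64": ...}` field for encoded bytes, and the 1.5 MB limit mentioned above:

```python
import base64
import json

PAYLOAD_LIMIT_BYTES = int(1.5 * 1024 * 1024)  # 1.5 MB limit from this report

def request_size_bytes(image_bytes: bytes) -> int:
    """Size of a single-instance online prediction request body
    carrying this image as base64 (hypothetical helper for sizing)."""
    body = {"instances": [{"b64": base64.b64encode(image_bytes).decode("ascii")}]}
    return len(json.dumps(body).encode("utf-8"))

# Even a ~1.2 MB image blows past the limit once encoded (~4/3 inflation):
raw = bytes(1_200_000)
size = request_size_bytes(raw)
print(size, size > PAYLOAD_LIMIT_BYTES)
```

Base64 alone turns 1.2 MB of image bytes into roughly 1.6 MB of JSON text, already over the limit before the image reaches the service.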
How this might work:
- Either raise the request payload size limit, or have the product team add better support for image inputs in Online Prediction.
If applicable, reasons why alternative solutions are not sufficient:
- A potential workaround would be to split the data across multiple prediction requests and have ML Engine wait and aggregate the results once all of the requests have arrived. Unfortunately, this is not currently available: each prediction request is processed and its results returned as soon as it is sent.
Other information (workarounds you have tried, documentation consulted, etc):