Request for new functionality
Description
Please provide as much information as possible. At least, this should include a description of your issue and steps to reproduce the problem. If possible please provide a summary of what steps or workarounds you have already tried, and any docs or articles you found (un)helpful.
Problem you have encountered:
The customer is looking for a straightforward way to debug their issue. They understand that the documentation states a limit of 10,000 characters per TextSnippet [1]. However, when batch prediction fails, there is no easy way to identify the problematic file, which makes the failure hard to debug from the customer's side.
[1] https://cloud.google.com/nodejs/docs/reference/automl/latest/automl/v1.predictionserviceclient#:~:text=TextSnippet%20up%20to%2010%2C000%20characters
What you expected to happen:
Ideally, the error should report the line number of the offending text, similar to the existing error for overly long input.
Alternatively, the batch job should continue processing the text samples that do not have this issue and complete with a "Warning".
Ideally, that warning should include the line number so the affected samples can be excluded from analysis.
Steps to reproduce:
Other information (workarounds you have tried, documentation consulted, etc):
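Until the service itself reports the offending line, one possible client-side workaround is to pre-validate the input before submitting the batch job. The sketch below is a hypothetical helper, not an official tool: it assumes input files with one text snippet per line (a real CSV/JSONL input would need parsing first) and flags every line longer than the documented 10,000-character TextSnippet limit.

```python
# Hypothetical pre-flight check (workaround sketch, not an official AutoML tool):
# scan a batch-prediction input file and report which lines exceed the
# 10,000-character TextSnippet limit before submitting the job.
import sys

MAX_SNIPPET_CHARS = 10_000  # limit stated in the AutoML documentation


def find_oversized_lines(path, limit=MAX_SNIPPET_CHARS):
    """Return (line_number, length) for every line longer than `limit`.

    Assumes one text snippet per line; adapt the loop if your input
    is CSV or JSONL with the text in a specific field.
    """
    oversized = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            length = len(line.rstrip("\n"))
            if length > limit:
                oversized.append((lineno, length))
    return oversized


if __name__ == "__main__":
    for path in sys.argv[1:]:
        for lineno, length in find_oversized_lines(path):
            print(f"{path}:{lineno}: {length} characters "
                  f"(limit {MAX_SNIPPET_CHARS})")
```

Running it over the input files before starting the job would surface the line numbers the error message currently omits, so the oversized samples can be excluded or truncated up front.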