Status: Assigned
Comments
va...@google.com
nr...@google.com #2
Hello,
Thank you for reaching out to us with your request.
We have duly noted your feedback and will thoroughly validate it. While we cannot provide an estimated time of implementation or guarantee that the request will be fulfilled, please be assured that your input is highly valued. Your feedback enables us to enhance our products and services.
We appreciate your continued trust and support in improving our Google Cloud Platform products. In case you want to report a new issue, please do not hesitate to create a new issue on the Issue Tracker.
Once again, we sincerely appreciate your valuable feedback. Thank you for your understanding and collaboration.
Description
Please provide as much information as possible. At a minimum, this should include a description of your issue and the steps to reproduce the problem. If possible, please provide a summary of the steps or workarounds you have already tried, and any docs or articles you found (un)helpful.
Problem you have encountered:
The current solution only supports TIME, DATE and TIMESTAMP values as integers (Unix epoch). However, the majority of dates, times and timestamps in JSON messages are not expressed as integers but as strings in ISO 8601 format, e.g.
YYYY-[M]M-[D]D{ |T|t}[H]H:[M]M:[S]S[.F]
This requires users to introduce a transformation step between the source and the target (BigQuery), and much of the value of using a BigQuery subscription is lost, since that transformation step could just as well take care of the ingestion itself. I would be happy to start using the BigQuery subscription, but this is currently a blocker.
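For illustration, below is a minimal Python sketch of the kind of transformation step that is currently required before publishing, converting the ISO 8601 strings to integer representations. The field names match the reproduction example further down; the exact integer encodings expected by the BigQuery subscription (microseconds since the Unix epoch for TIMESTAMP, days since the epoch for DATE, microseconds since midnight for TIME) are assumptions and should be checked against the documentation.

from datetime import date, datetime, time, timezone

def to_epoch_ints(msg: dict) -> dict:
    # Assumed encodings: TIMESTAMP as microseconds since the Unix epoch,
    # DATE as days since the Unix epoch, TIME as microseconds since midnight.
    ts = datetime.fromisoformat(msg["a_timestamp"]).replace(tzinfo=timezone.utc)
    d = date.fromisoformat(msg["a_date"])
    t = time.fromisoformat(msg["a_time"])
    return {
        "a_timestamp": int(ts.timestamp() * 1_000_000),
        "a_date": (d - date(1970, 1, 1)).days,
        "a_time": ((t.hour * 60 + t.minute) * 60 + t.second) * 1_000_000 + t.microsecond,
    }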
What you expected to happen:
That ISO 8601 string representations of TIME, DATE and TIMESTAMP values would be ingested into the corresponding fields in the BigQuery table without first converting them to integers.
Steps to reproduce:
Using a table like
CREATE OR REPLACE TABLE test.time_types(
a_timestamp TIMESTAMP,
a_time TIME,
a_date DATE
)
and publishing a message like
{
"a_timestamp": "2024-03-04T22:01:01.123456",
"a_time": "22:01:01",
"a_date": "2024-03-04"
}
should insert the values into the BigQuery table, but instead I get the following error message: "(a_date): invalid value "2024-03-04" for type TYPE_INT32"
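For reference, the message above is published along the lines of the following minimal sketch, assuming the google-cloud-pubsub client library and a topic with the BigQuery subscription attached (project and topic names are placeholders):

import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Placeholder project and topic names; the topic has a BigQuery
# subscription writing to test.time_types.
topic_path = publisher.topic_path("my-project", "my-topic")

message = {
    "a_timestamp": "2024-03-04T22:01:01.123456",
    "a_time": "22:01:01",
    "a_date": "2024-03-04",
}
future = publisher.publish(topic_path, data=json.dumps(message).encode("utf-8"))
print(future.result())  # message ID on success; the error above surfaces on the subscription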
However, doing an insert like this works:
INSERT INTO `streamprocessor-org.test.time_types` VALUES("2024-03-04T22:01:01.123456", "22:01:01","2024-03-04")
Other information (workarounds you have tried, documentation consulted, etc):