Status Update
Comments
bl...@google.com <bl...@google.com>
ya...@google.com <ya...@google.com> #3
Before the feature is released, you can save the timestamp in your schema as a workaround.
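One way to apply the workaround above is to stamp each row client-side before inserting it. A minimal sketch, assuming a TIMESTAMP column named `inserted_at` in your schema (the column name and the `my_dataset.my_table` table are hypothetical; the commented call uses the google-cloud-bigquery Python client):

```python
from datetime import datetime, timezone

def stamp_rows(rows, field="inserted_at"):
    """Add a client-side insertion timestamp to each row.

    BigQuery accepts TIMESTAMP values as ISO 8601 / RFC 3339
    strings, so we format the current UTC time accordingly.
    """
    now = datetime.now(timezone.utc).isoformat()
    return [{**row, field: now} for row in rows]

rows = stamp_rows([{"user_id": 1}, {"user_id": 2}])

# With the google-cloud-bigquery client, the stamped rows would then
# be streamed with something like:
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   errors = client.insert_rows_json("my_dataset.my_table", rows)
```

This gives every row an explicit timestamp under your control, at the cost of the timestamp reflecting client time rather than server-side insertion time.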
[Deleted User] <[Deleted User]> #7
Is this implemented already? If yes, how can it be used?
sh...@gmail.com <sh...@gmail.com> #9
If I do a daily reload, how can I be sure I won't lose data if I don't have a stream timestamp to compare against?
[Deleted User] <[Deleted User]> #10
In reply to the Jan 17, 2018 02:52 PM comment: how can we save the timestamp in our schema?
cc...@sperdegroot.nl <cc...@sperdegroot.nl> #21
So the string value 'AUTO' gets converted to CURRENT_TIMESTAMP() at the moment of insertion? That is useful, but not quite what we were hoping for, since we would like it to be defined in the schema, not in the data.
as...@manh.com <as...@manh.com> #22
"So the string value 'AUTO' gets converted to CURRENT_TIMESTAMP() at the moment of insertion?"
Basically, I'd like to use this field to filter out already-processed rows from a staging table. The rows arrive via Dataflow and can come out of order, so I was thinking this new field would be more accurate for filtering them out.
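The filtering idea above works because an ingestion timestamp set at insert time only moves forward, even when the rows' own event timestamps arrive out of order. A minimal sketch in plain Python (field names `insert_ts` and `event_ts` are hypothetical; ISO 8601 strings compare correctly lexicographically):

```python
def unprocessed(rows, watermark, ts_field="insert_ts"):
    """Return rows whose ingestion timestamp is after the watermark.

    Event-time fields can arrive out of order, but an ingestion
    timestamp assigned at insert time is monotonic per load, so it
    is a safe high-water mark for incremental processing.
    """
    return [r for r in rows if r[ts_field] > watermark]

staging = [
    {"id": 1, "event_ts": "2018-01-17T10:00:00", "insert_ts": "2018-01-17T12:00:00"},
    {"id": 2, "event_ts": "2018-01-17T09:00:00", "insert_ts": "2018-01-17T12:05:00"},  # late by event time
    {"id": 3, "event_ts": "2018-01-17T11:00:00", "insert_ts": "2018-01-17T11:00:00"},  # already processed
]
todo = unprocessed(staging, watermark="2018-01-17T11:30:00")
# todo contains ids 1 and 2
```

In BigQuery itself the same filter would be a `WHERE insert_ts > @watermark` predicate over the staging table.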
ze...@gmail.com <ze...@gmail.com> #23
I get an error:
Failure details:
- Error while reading data, error message: JSON processing
encountered too many errors, giving up. Rows: 1; errors: 1; max
bad: 0; error percent: 0
- Error while reading data, error message: JSON parsing error in row
starting at position 0: Couldn't convert value to timestamp: Could
not parse 'auto' as a timestamp. Required format is YYYY-MM-DD
HH:MM[:SS[.SSSSSS]] or YYYY/MM/DD HH:MM[:SS[.SSSSSS]] Field:
time_stamp; Value: auto
here is part of the JSON schema:
{
  "mode": "NULLABLE",
  "name": "iana_flowEndSysUpTime",
  "type": "INTEGER"
},
{
  "name": "time_stamp",
  "type": "TIMESTAMP"
}
]
and here is the data:
{"time_stamp":"auto","type":"ipfix.entry","iana_flowEndS
any idea why it's not working?
[Deleted User] <[Deleted User]> #26
Are you SURE that AUTO works? In my NEWLINE_DELIMITED_JSON file I've tried:
- "foo":"AUTO"
- "foo":'AUTO'
- "foo":"'AUTO'"
- "foo":AUTO
and using the bq load command, I get the above-mentioned error for all of these permutations:
- Error while reading data, error message: JSON parsing error in row
starting at position 0: Couldn't convert value to timestamp: Could
not parse 'AUTO' as a timestamp. Required format is YYYY-MM-DD
HH:MM[:SS[.SSSSSS]] or YYYY/MM/DD HH:MM[:SS[.SSSSSS]] Field:
foo; Value: AUTO
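The errors in the two comments above are consistent with the AUTO sentinel being honored only by the streaming insertAll path, not by load jobs (an inference from this thread; it is not documented behavior). One workaround for bq load is to rewrite the NDJSON and replace the sentinel with a concrete timestamp before loading. A minimal sketch (the field name `time_stamp` follows the schema quoted above):

```python
import json
from datetime import datetime, timezone

def resolve_auto(ndjson_text, field="time_stamp"):
    """Replace an 'auto'/'AUTO' sentinel with a concrete timestamp
    in NDJSON, so the file passes BigQuery's TIMESTAMP parsing
    (YYYY-MM-DD HH:MM:SS.SSSSSS) on load."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S.%f")
    out_lines = []
    for line in ndjson_text.splitlines():
        row = json.loads(line)
        if str(row.get(field, "")).lower() == "auto":
            row[field] = now
        out_lines.append(json.dumps(row))
    return "\n".join(out_lines)

src = ('{"time_stamp":"auto","type":"ipfix.entry"}\n'
       '{"time_stamp":"2018-01-17 12:00:00","type":"ipfix.entry"}')
fixed = resolve_auto(src)
```

The rewritten file can then be passed to bq load as usual; rows that already carry a valid timestamp are left untouched.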
pd...@bendingspoons.com <pd...@bendingspoons.com> #28
Hello, AUTO works for me when passed in the JSON payload.
However, I don't see it documented anywhere in the BigQuery official docs. Indeed, I discovered it from this bug report, linked from StackOverflow.
I think this really nice feature should be documented and given more visibility. :-)
mb...@gmail.com <mb...@gmail.com> #34
I'm loading JSON files with 4 records per file and would like to append an insertion datetime to each row without having to rewrite my JSON (Python manipulation).
el...@papaya.com <el...@papaya.com> #35
We are still trying to get the insertion time of the rows.
em...@gmail.com <em...@gmail.com> #36
Created the ticket bq load.
in...@gmail.com <in...@gmail.com> #38
Hello
mk...@gmail.com <mk...@gmail.com> #39
Art
lu...@awantec.my <lu...@awantec.my> #40
Kindly update us on this request. It's very important to have a timestamp for every row inserted.
bi...@google.com <bi...@google.com> #44
For the load job API, please follow up with the linked issue.
The column default value (CURRENT_TIMESTAMP) is supported there.
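The column default mentioned in the comment above can be declared in DDL. A minimal sketch, with the table and column names (`my_dataset.my_table`, `inserted_at`) hypothetical; verify default-value behavior for your load path against the current BigQuery docs:

```python
# DDL for a table whose inserted_at column falls back to a
# server-side default expression when the column is omitted.
ddl = """
CREATE TABLE my_dataset.my_table (
  user_id INT64,
  inserted_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP()
)
"""

# With the google-cloud-bigquery client this would be executed as:
#   from google.cloud import bigquery
#   bigquery.Client().query(ddl).result()
```

Unlike the AUTO sentinel, the default lives in the schema rather than in the data, which is what several commenters above were asking for.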
Description