Comments
st...@yeti.com <st...@yeti.com> #2
This impacts me as well. I would love to see this feature implemented.
ha...@bybit.com <ha...@bybit.com> #3
Totally agree! We strongly suggest this feature, split the file size please!
i....@gmail.com <i....@gmail.com> #4
Totally agree! We strongly suggest this feature, split the file size please!
lu...@itau-unibanco.com.br <lu...@itau-unibanco.com.br> #5
This would be so helpful for my team!
de...@itau-unibanco.com.br <de...@itau-unibanco.com.br> #6
Totally agree! We strongly suggest this feature, split the file size please!
je...@gmail.com <je...@gmail.com> #7
This would be so important to the community :)
ma...@gmail.com <ma...@gmail.com> #8
Totally agree! We strongly suggest this feature, split the file size please!
he...@itau-unibanco.com.br <he...@itau-unibanco.com.br> #9
Totally agree! We strongly suggest this feature, split the file size please!
ma...@itau-unibanco.com.br <ma...@itau-unibanco.com.br> #10
Totally agree!!! This would be so helpful for big companies like us!
vi...@itau-unibanco.com.br <vi...@itau-unibanco.com.br> #11
I agree, this feature would be very useful
fa...@mailer.com.br <fa...@mailer.com.br> #12
Totally agree! We strongly suggest this feature, split the file size please!
lu...@itau-unibanco.com.br <lu...@itau-unibanco.com.br> #13
This feature would be very useful.
2d...@itau-unibanco.com.br <2d...@itau-unibanco.com.br> #14
Totally agree! We strongly suggest this feature, split the file size please!
fa...@itau-unibanco.com.br <fa...@itau-unibanco.com.br> #15
Totally agree! We strongly suggest this feature, split the file size please!
[Deleted User] <[Deleted User]> #16
Totally agree! We strongly suggest this feature, split the file size please!
bw...@google.com <bw...@google.com> #17
I never imagined we would have bots upvoting public trackers. :)
i....@gmail.com <i....@gmail.com> #18
We are not bots. We are employees on the same team trying to get this feature. :) We really need it; that's why we've formed this task force.
i....@gmail.com <i....@gmail.com> #19
Hi there! Any updates?
i....@gmail.com <i....@gmail.com> #20
Hi there =-) any updates?
ga...@google.com <ga...@google.com> #21
Thank you for voting on this feature. We have looked into the request, but unfortunately adding this configuration to the export pipeline isn't trivial due to other systemic constraints around sharding. We are looking at post-export processing options, and the ask is in our backlog.
In the meantime, we would encourage you to look at the BigQuery Storage Read API, which offers more control and flexibility for consuming BigQuery data: https://cloud.google.com/bigquery/docs/reference/storage
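As a rough illustration of the suggested alternative, here is a minimal sketch of reading a table through the Storage Read API with the Python client, based on the library's documented usage; the project, dataset, and table names are placeholders. Each stream can be consumed in parallel, which sidesteps the fixed sharding of file exports:

```python
from google.cloud.bigquery_storage import BigQueryReadClient, types

# Placeholders: substitute your own project, dataset, and table.
project_id = "my-project"
table = "projects/my-project/datasets/my_dataset/tables/my_table"

client = BigQueryReadClient()
requested_session = types.ReadSession(
    table=table,
    data_format=types.DataFormat.ARROW,  # Arrow decoding requires pyarrow
)
# More streams -> more parallel readers over the same table snapshot.
session = client.create_read_session(
    parent=f"projects/{project_id}",
    read_session=requested_session,
    max_stream_count=4,
)

for stream in session.streams:
    reader = client.read_rows(stream.name)
    for row in reader.rows(session):
        pass  # process each row (a mapping of column name -> value)
```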
ma...@sky.uk <ma...@sky.uk> #22
Hi, if at all possible I would also benefit from this feature.
[Deleted User] <[Deleted User]> #23
This would be so important to the community :)
i....@gmail.com <i....@gmail.com> #24
Hi there, any update?
ma...@gmail.com <ma...@gmail.com> #25
up!
er...@gmail.com <er...@gmail.com> #26
Any update? We need this
va...@newsweek.com <va...@newsweek.com> #27
Last year the export file size was around 20-25 MB per file when using a wildcard, but now (Sep 2023) it has come down to 1-2 MB per file, and the number of output files has increased from 60 to 600. It's a bit of a pain to use with Spark.
i....@gmail.com <i....@gmail.com> #28
Totally agree, va...@newsweek.com! It's a bit of a pain.
i....@gmail.com <i....@gmail.com> #29
Hi there, any update?
bu...@toptal.com <bu...@toptal.com> #30
Hello, any updates on this request?
va...@newsweek.com <va...@newsweek.com> #31
I have the same problem: a Snappy-compressed Parquet export that used to produce 100 files of 30 MB each now produces 1500+ files of 1 MB each.
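As a stopgap for the small-files problem described above, the export can be compacted with Spark before downstream use; a rough PySpark sketch, where the bucket paths and target file count are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("compact-bq-export").getOrCreate()

# Read the many small Parquet files produced by the export
# (path is a placeholder).
df = spark.read.parquet("gs://my-bucket/export/*.parquet")

# Rewrite into a fixed number of larger files; coalesce avoids a full
# shuffle, while repartition would give more evenly sized outputs.
df.coalesce(100).write.mode("overwrite").parquet(
    "gs://my-bucket/export-compacted/"
)
```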
is...@itau-unibanco.com.br <is...@itau-unibanco.com.br> #32
Hi there, any update?
jp...@bunnings.com.au <jp...@bunnings.com.au> #33
This is a very much needed feature; really looking forward to at least some workarounds!
md...@ford.com <md...@ford.com> #34
We have the same problem. 2000 files for 1 partition.
i....@gmail.com <i....@gmail.com> #35
Hello everyone. We have been waiting for this requested feature since 2022. More than 2 years... Is there any update?
Thanks.
Description
What you would like to accomplish:
When exporting data from BigQuery, it would be helpful to have an option to split the export into multiple files depending on the desired file size.
I know there is a compression option, but when the data needs to be uploaded to external destinations, there are often requirements on file type and size.
How this might work:
When given the option to export BQ data as a CSV, have a File Size Limit option where we can input a desired file size. The resulting export would be all of the data split across multiple files, where the File Size Limit is the maximum size of each file.
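For context, this is roughly what the current export looks like with the Python client: the wildcard destination URI lets BigQuery shard the output, but there is no parameter to request a maximum file size, which is the option asked for here. Bucket and table names below are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Wildcard destination: BigQuery chooses the number and size of the
# shards; today there is no way to request a maximum file size.
destination_uri = "gs://my-bucket/export/data-*.csv"
table_ref = bigquery.TableReference.from_string(
    "my-project.my_dataset.my_table"
)

extract_job = client.extract_table(table_ref, destination_uri, location="US")
extract_job.result()  # wait for the export to finish
```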
Problem you have encountered:
Uploading a BQ CSV export to an external destination and hitting a file size limit. We have to manually split the file into multiple sections to get it through.
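Until there is a built-in option, the manual split can at least be scripted; a minimal sketch (file names and the size limit are placeholders) that cuts a CSV export into parts under an approximate byte limit while repeating the header in each part:

```python
import csv

def split_csv(src_path: str, max_bytes: int, prefix: str) -> None:
    """Split a CSV export into parts of at most ~max_bytes each,
    repeating the header row at the top of every part."""
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        part_num = 0
        out = None
        writer = None
        for row in reader:
            # Start a new part when none is open or the current one has
            # reached the limit (approximate: a part may exceed it by at
            # most one row).
            if out is None or out.tell() >= max_bytes:
                if out is not None:
                    out.close()
                part_num += 1
                out = open(f"{prefix}-{part_num:04d}.csv", "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
        if out is not None:
            out.close()

# Example: 100 MB parts from a single large export.
split_csv("bq_export.csv", 100 * 1024 * 1024, "bq_export_part")
```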
I've found multiple Stack Overflow posts on this topic with no easy answer: