Background
The current method using load jobs is optimized for large loads. However, most load packages are quite small, so we can improve loading speeds by implementing streaming inserts. There's a Singer target https://github.com/z3z1ma/target-bigquery by @z3z1ma from which we can take code.
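For context, a minimal sketch of what a streaming insert looks like with the official `google-cloud-bigquery` client; the table id and rows are illustrative:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Streaming inserts take plain JSON-serializable rows and return per-row
# errors instead of scheduling an asynchronous load job, which avoids the
# job overhead that dominates small load packages.
table_id = "my-project.my_dataset.my_table"  # illustrative
rows = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]
errors = client.insert_rows_json(table_id, rows)
if errors:
    raise RuntimeError(f"streaming insert failed: {errors}")
```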
We'll implement an unconstrained (schema-less) version of the above when #891 is merged; this issue is about extending the existing destination.
Tasks
allow the user to select the loading API via the destination configuration and per resource/table via the BigQuery adapter (see the sketch after this list)
implement the insert API and optionally the storage write API
allow both parquet and jsonl to be loaded this way; consider adding standard file readers for those formats (port them from the data sink destination #891; see the reader sketch after this list)
tests and documentation
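A hypothetical sketch of what the per-resource selection could look like. The adapter import path mirrors dlt's existing adapters, but the `insert_api` keyword and the config key shown in comments are proposed names for this issue, not an existing API:

```python
import dlt
from dlt.destinations.adapters import bigquery_adapter  # existing adapter, proposed keyword below

@dlt.resource
def small_events():
    # A small resource where streaming inserts should beat a load job.
    yield from [{"id": 1}, {"id": 2}]

# Destination-wide default via configuration (proposed key):
# [destination.bigquery]
# insert_api = "default"

# Per-resource override via the adapter (proposed keyword):
pipeline = dlt.pipeline(destination="bigquery")
pipeline.run(bigquery_adapter(small_events, insert_api="streaming"))
```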
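And a sketch of the kind of standard jsonl file reader the file-format task refers to, assuming rows should be yielded in chunks sized for the insert API; the function name and chunk size are illustrative:

```python
import json
from typing import IO, Any, Dict, Iterator, List

def read_jsonl(file: IO[bytes], chunk_size: int = 500) -> Iterator[List[Dict[str, Any]]]:
    """Yield parsed jsonl rows in chunks suitable for streaming inserts."""
    chunk: List[Dict[str, Any]] = []
    for line in file:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        chunk.append(json.loads(line))
        if len(chunk) >= chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk
```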