Hi,
I extract more than 1 TB of data from BQ to Qlik Sense SaaS every day, and it takes a lot of time; performing transformations on top of it adds another hour or more.
How can I extract TBs of data in just a few minutes? Is there any quick fix between Qlik and BQ?
Thanks
There are multiple components in this. How fast is BQ able to deliver data? You could, for example, test how long it takes to run the same extraction within BQ and store the result in a file (or actually multiple files, with that amount of data) in a Google Cloud Storage bucket. That's an indication of how long the BQ processing time is.
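As a minimal sketch of such a test, you could run something like the following directly in the BigQuery console, using BQ's EXPORT DATA statement; the project, dataset, table, and bucket names here are placeholders for your own:

-- Time how long BQ alone needs to run the extraction query and
-- materialize the result, with Qlik and the network out of the picture.
-- For large results the URI must contain a '*' wildcard, so BQ writes
-- multiple files (as noted above).
EXPORT DATA OPTIONS (
  uri = 'gs://my-bucket/bq-extract-test/part-*.csv',  -- placeholder bucket
  format = 'CSV',
  overwrite = true
) AS
SELECT *
FROM `my-project.my_dataset.my_table`;  -- your actual extraction query

The job's elapsed time shown in the BigQuery console then tells you how much of your hour is pure BQ processing.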
There's also the fact that it takes time to transfer data between servers. Network capacity between BQ and the Qlik server is a factor: 1 TB is roughly 8,000 gigabits, so even over a fully utilized 1 Gbit/s connection the transfer alone takes about 8,000 seconds, i.e. somewhat over two hours. And the actual speed depends on so many different things.
There's probably not much you can do about the actual data transfer time.
Transforming this amount of data also takes time, of course. Sometimes it is possible to optimize transformations, but it really depends on what you are doing in the transformation, and whether there are alternative approaches that would speed up the process (for example: ApplyMap() is usually faster than joining tables, but is only appropriate in certain situations; see the sketch below).
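To illustrate the ApplyMap() point, here is a minimal load-script sketch with made-up table, field, and file names. A mapping table must have exactly two fields: the lookup key and the value to return:

// Mapping table: first field is the lookup key, second is the value.
CountryMap:
MAPPING LOAD
    CountryCode,
    CountryName
FROM [lib://DataFiles/Countries.qvd] (qvd);

// Instead of joining the country table onto the fact table, look each
// code up in the map. The third argument is the default value returned
// when no match is found.
Orders:
LOAD
    OrderID,
    Amount,
    ApplyMap('CountryMap', CountryCode, 'Unknown') AS CountryName
FROM [lib://DataFiles/Orders.qvd] (qvd);

Unlike a JOIN, this never changes the number of rows in the fact table and avoids a sort/match pass over both tables, which is why it tends to be faster for simple key-to-value lookups.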
Hi @henrikalmen,
I see the query takes around 5 minutes to execute on the BQ side, while extracting the data into Qlik takes around an hour.
During execution, I noticed that Qlik only extracts 20,000 records at a time and keeps going.
Can I extract 100,000 records at a time, or otherwise increase the data extraction chunk size?
Thanks
Good questions! I don't know - I have some experience with BQ, but it's from a few years back and I never used it together with Qlik.
But it seems that the Qlik connector for BQ has a "Rows Per Block" setting and an "Allow Large Result Sets" flag: https://help.qlik.com/en-US/connectors/Subsystems/ODBC_connector_help/Content/Connectors_ODBC/Google...