Hi,
I have a huge PostgreSQL table of 120M rows (40 columns) that I am able to dump to CSV on the Qlik Sense host with 'psql \copy' in about 17 minutes. However, using the default PostgreSQL connector in the latest Qlik Sense (Feb 2021 SP1), the same load seems to take around 5 hours. I am not sure why, but I get the feeling that the PostgreSQL connector is outdated or misconfigured.
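For reference, the CSV dump is done roughly like this (table and column names are placeholders, not the real ones):

    \copy (SELECT x, y, z FROM my_big_table) TO 'my_big_table.csv' WITH (FORMAT csv, HEADER);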
The machine has 16 cores and 80 GB RAM, and the load query is as simple as it gets: Load *; select x, y, z, ... from table;
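In the Qlik load script it looks roughly like this (connection name and table are placeholders):

    LIB CONNECT TO 'PostgreSQL_mydb';

    BigTable:
    LOAD *;
    SELECT x, y, z FROM my_big_table;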
Is anyone else using PostgreSQL and Qlik Sense with data of this size?
I can add that this occurs on PostgreSQL 13.2. I "solved" it by installing the psqlODBC driver, configuring it with a larger fetch size, and then connecting Qlik Sense through ODBC instead of the built-in connector.
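If it helps anyone: on Linux the DSN looked roughly like this (host, database and the fetch value are just examples; if I remember right the relevant psqlODBC settings are UseDeclareFetch and Fetch, and on Windows the same options are in the psqlODBC DSN dialog):

    [PG_bigtable]
    Driver = PostgreSQL Unicode
    Servername = dbhost
    Port = 5432
    Database = mydb
    Username = qlik
    UseDeclareFetch = 1
    Fetch = 50000

Then the load script just points at the ODBC connection instead of the built-in connector:

    LIB CONNECT TO 'ODBC_PG_bigtable';

    BigTable:
    LOAD *;
    SQL SELECT x, y, z FROM my_big_table;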