I'm extracting a table from Oracle - 2.2 million rows, 10 columns, and the extract/load job in Qlik Sense took over 8 hours.
Using the Oracle SQL Developer tool, I downloaded the same data into a csv file in under 30 minutes, then loaded it into Qlik Sense in less than a minute.
Is there a page of tips to speed up the Qlik-Oracle connection? (ODBC via a 64-bit system DSN).
(I think I can probably get away with a single one-off load followed by daily differential loads, but it would be good to have an answer on hand).
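For reference, the one-off-load-plus-daily-differentials idea can be sketched in Qlik load script. This is only a sketch under assumptions: it presumes the table has a modification-timestamp column (hypothetically named LAST_MODIFIED here), and the table name, QVD path, and date format are placeholders, not your actual schema.

```
// Incremental load sketch -- table name, column names, and QVD path are assumptions.
// First run: full extract stored to a QVD. Later runs: fetch only new/changed rows.

LET vQvd = 'lib://DataFiles/MyTable.qvd';

IF NOT IsNull(FileTime('$(vQvd)')) THEN
    // Find the newest timestamp already stored locally.
    MaxStamp:
    LOAD Max(LAST_MODIFIED) AS MaxStamp FROM [$(vQvd)] (qvd);
    LET vMax = Peek('MaxStamp', 0, 'MaxStamp');
    DROP TABLE MaxStamp;

    // Pull only rows modified since then over the slow link.
    MyTable:
    SQL SELECT * FROM MY_TABLE
    WHERE LAST_MODIFIED > TO_DATE('$(vMax)', 'YYYY-MM-DD HH24:MI:SS');

    // Append the previously stored rows from the fast local QVD.
    CONCATENATE (MyTable)
    LOAD * FROM [$(vQvd)] (qvd);
ELSE
    // One-off full extract on the first run.
    MyTable:
    SQL SELECT * FROM MY_TABLE;
END IF

STORE MyTable INTO [$(vQvd)] (qvd);
```

With this pattern only the daily delta crosses the WAN; the bulk of the data comes from the local QVD, which Qlik reads very quickly.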
Here is a document with best practices for using any connection.
Hope this helps.
Just for reference, I use an ODBC connection to a MySQL database, and it only takes about 8 minutes to load a 12M-row table with 12 columns.
So I think it is really strange for your load to take 8 hours!
If it is not a temporary network problem (an unstable network can also affect load time), do you maybe have a synthetic key in your load? I know that can affect load time.
Interesting(!) Is your MySQL database on a local machine? (My Oracle database is in a different country, so I expect it to take some time, just not 8 hours.)
There is no synthetic key, and the extract is just a plain table - there isn't any aggregation in the query. It's just a plain "SELECT * FROM TABLE". The only difference in my two cases is the tool being used; Qlik is slow, extracting via the Oracle tool is fast.
There's a lot of good information in there, thanks. (It doesn't seem applicable to my issue, sadly, but there is enough in there to make it worthwhile reading for any use-case!)
No, my MySQL database is not on a local machine, but it is on the same network within my company, so it is faster than yours.
That said, if a load takes a long time and the network is not stable, it can easily take much longer. I have also seen the computer appear to hang while reading a huge table with more than 50M rows, because a huge table also eats a lot of memory. But I don't think that is your case here.
You can try loading fewer rows first to see whether it is faster. If it is, try more rows and see where you hit the limit; then you can use QVD files to load the data in parts. But if it is always as slow as 8 hours for 2M rows, which is roughly 20 minutes per 100k, then I have no idea why.
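The "load fewer rows first" test above can be done with the FIRST prefix in the load script, so you don't have to rewrite the SQL. A minimal sketch, assuming a hypothetical connection name and table:

```
// Time a small extract first -- connection and table names are placeholders.
LIB CONNECT TO 'Oracle_DSN';

SampleRows:
FIRST 100000
SQL SELECT * FROM MY_TABLE;
```

If 100k rows load in a few minutes but the full table takes hours, that points at throughput over the link rather than per-query overhead.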
Hope this helps.
It's a different continent, but a reliable private network, so I expect some latency.
The only difference is the client I use. An Oracle client (Oracle SQL Developer) running on my machine here vs. Qlik Server running on the same machine here.
Any network effects would be the same for both cases.
Having your database in a remote country, even on the same corporate private network, is certainly not best practice for good performance when you have millions of rows to transfer.
Local network bandwidth saturation, WAN bandwidth limits, speed, stability, latency, micro-disconnections... any of these could explain your situation.
I recommend you either set up your Qlik server close to your main database infrastructure, or schedule an Oracle data export to a csv or txt file and download it to be imported regularly by your Qlik server.
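If you go the scheduled-export route, the Qlik side is a plain file load. A sketch, where the path and format specification are assumptions about what the export would look like:

```
// Load the scheduled Oracle csv export -- path and format spec are assumptions.
MyTable:
LOAD *
FROM [lib://DataFiles/my_table_export.csv]
(txt, utf8, embedded labels, delimiter is ',', msq);

// Store to QVD so subsequent reloads read the fast local copy.
STORE MyTable INTO [lib://DataFiles/my_table.qvd] (qvd);
```

This matches what you already observed: the csv-then-load path took under 31 minutes end to end, versus 8+ hours over ODBC.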
Thanks - moving the database & Qlik closer together is not an option.
It's unfortunate that the Qlik performance is so much worse than the SQL Developer performance, but I'll have to learn to live with it.