Hi all,
I am trying to pull data from an Oracle database into Qlik. My primary table contains a million records, and two more columns from two different tables have to be left-joined onto it. The problem is that those tables each contain more than 100 crore (1 billion) records!
What is the most efficient way to set up this load? Should I create a view in the DB itself, or let Qlik do the work?
Help appreciated!
Regards,
Arpit
Creating a view leaves the work to the DB itself and spares Qlik - that would certainly be the more performant option.
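A minimal sketch of the view approach, assuming a DBA creates a view in Oracle that does the two left joins, so Qlik only pulls the already-joined result set (all table, column, and connection names below are hypothetical):

```
// Hypothetical Oracle view, created DB-side, e.g.:
//   CREATE OR REPLACE VIEW v_primary_enriched AS
//   SELECT p.*, a.col_a, b.col_b
//   FROM primary_table p
//   LEFT JOIN big_table_a a ON a.key_id = p.key_id
//   LEFT JOIN big_table_b b ON b.key_id = p.key_id;

LIB CONNECT TO 'Oracle_DSN';   // hypothetical connection name

PrimaryData:
SQL SELECT * FROM v_primary_enriched;
```

This way the billion-row tables never cross the wire; Oracle's optimizer and indexes do the join, and Qlik receives only the one-million-row result.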
Hi Arpit,
Try to build it in Qlik - that way Qlik serves its purpose as an ETL tool: extraction first, then transformation.
I think you will get more control if you keep all the data in Qlik itself.
Thanks,
Arvind Patil
Thanks for the quick response!
Is there any other way you can think of?
Thanks Arvind, this makes sense too. But the load will likely go on forever.
If you want to do it in Qlik, I would suggest using ApplyMap rather than a join. Check this: Don't join - use Applymap instead
Okay. But the load time will be roughly the same, right?
No, ApplyMap is supposed to be faster.
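A minimal sketch of the ApplyMap approach (table, field, and mapping names are hypothetical): instead of joining the two huge tables, pull only the key plus the one needed column from each into mapping tables, then look the values up while loading the primary table.

```
// Mapping tables: only two fields per table are pulled, not the whole row.
MapA:
MAPPING SQL SELECT key_id, col_a FROM big_table_a;

MapB:
MAPPING SQL SELECT key_id, col_b FROM big_table_b;

// Load the primary table and look up the two columns instead of joining.
Primary:
LOAD *,
     ApplyMap('MapA', key_id, Null()) AS col_a,
     ApplyMap('MapB', key_id, Null()) AS col_b;
SQL SELECT * FROM primary_table;
```

Unlike a join, ApplyMap cannot multiply rows on duplicate keys, and the third argument supplies a default when a key is missing.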
Hi Arpit,
Usually, dedicated ETL tools are used for such huge volumes of records.
However, given your situation, load the data from the DB into Qlik and then store it to QVDs. Once you start working from QVDs, subsequent loads into the QVW will be a lot faster.
Also, do not use a join in this situation. Use ApplyMap, or a link table/association, to filter the data as required.
You can even break the data down into multiple tables if it is heterogeneous.
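A sketch of the QVD layer described above (paths and names are hypothetical): extract from Oracle once, store to QVD, and let all later reloads read the QVD instead of hitting the database.

```
// One-time (or scheduled) extract: pull from Oracle, persist as QVD.
BigA:
SQL SELECT key_id, col_a FROM big_table_a;

STORE BigA INTO [lib://DataFiles/big_table_a.qvd] (qvd);
DROP TABLE BigA;

// Subsequent apps load from the QVD; a straight load with no
// transformations runs as an "optimized" QVD load and is very fast.
BigA:
LOAD key_id, col_a
FROM [lib://DataFiles/big_table_a.qvd] (qvd);
```

Combined with the ApplyMap suggestion, the expensive database pull happens once per refresh cycle, and every QVW after that works from local QVD files.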