ioannagr
Creator III

Load table from MySQL in parts in SQL SELECT

Hi all.

I'm thinking of loading data from a very big table in a MySQL db in parts, within the SQL SELECT.

So I have

LOAD pk from Table1;

INNER JOIN
LOAD pk as pk;
SQL SELECT pk, year FROM Table2
WHERE year = '2020'
UNION ALL
SELECT pk, year FROM Table2
WHERE year = '2021';


Would something like that work? And how would I do it with a loop in the SQL SELECT part?

7 Replies
Dalton_Ruer
Support

I'm not sure what you are trying to accomplish by not just selecting everything from Table2 in a single query. With a UNION ALL you are still going to build a giant result set on the MySQL side.

If you are trying to save processing in MySQL, you could bring the data back in sets and concatenate the data inside of Qlik Sense instead.
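A minimal sketch of that set-by-set approach in load script, assuming a database connection is already defined and using the table and field names from the question (Table2, pk, year) as placeholders:

```
// Pull one year at a time from MySQL and stack the result sets in Qlik.
// Loads with identical field lists auto-concatenate into one table.
FOR vYear = 2020 TO 2021

  Combined:
  LOAD pk, year;            // preceding load runs inside Qlik
  SQL SELECT pk, year
  FROM Table2
  WHERE year = '$(vYear)';  // one year per round trip to MySQL

NEXT vYear
```

Each iteration sends a smaller query to MySQL, so no single giant result set has to be built on the database side.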

ioannagr
Creator III
Author

@Dalton_Ruer this table is a billion rows. I want to do a hard-delete incremental load, which, following the Qlik Help routine (https://help.qlik.com/en-US/sense/February2021/Subsystems/Hub/Content/Sense_Hub/LoadData/use-QVD-fil...), involves an inner join with the initial table, calling it straight from the database.

I have figured out a way to split the tables of interest by year into QVDs for the initial load, but the hard-delete routine calls for reading the table once more straight from the database, and I have NO clue how to make that speedier. I've been looking into it for days, and the load time is days... the incremental load goes to waste.

Dalton_Ruer
Support

Gotcha. Incremental load makes a lot of sense, but Qlik doesn't have a UNION ALL, which is why I thought you were doing that in MySQL. I understand your confusion and will go through what the post meant by INNER JOIN.

If you already have your years broken out, then I would handle the incremental load on a year-by-year basis per the normal scheme.

Qlik now provides a MERGE prefix to handle incremental loads more cleanly for integration with Change Data Capture (https://help.qlik.com/en-US/sense/February2021/Subsystems/Hub/Content/Sense_Hub/Scripting/ScriptPref...), but it doesn't handle DELETEs, so you might as well stick with the known method.

Typically for incremental load you would:

1. Read the new information into a table; we will call it MyTable.

2. Concatenate the old information (from the QVD) into the new table using WHERE NOT EXISTS. (The WHERE NOT EXISTS forces you to keep the updated values.)

3. INNER JOIN. The inner join is done within Qlik. The concept is:

- Read all of the primary keys from the database into a table; we will call it MyTableKeys.

- Then do an inner join within Qlik Sense (not the db) between MyTable and MyTableKeys, thus removing any rows that no longer exist in the database.
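In load script, that key-trimming step might be sketched like this, assuming MyTable already holds the merged data and using Table2/pk as placeholder names:

```
// All primary keys currently in the database, loaded live.
// Only the keys cross the wire, not the wide rows.
MyTableKeys:
LOAD pk;
SQL SELECT pk FROM Table2;

// The join happens inside Qlik Sense, not in MySQL:
// rows in MyTable whose pk no longer exists in the db are dropped.
Inner Join (MyTable)
LOAD pk RESIDENT MyTableKeys;

DROP TABLE MyTableKeys;   // the keys table is only scaffolding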


ioannagr
Creator III
Author

@Dalton_Ruer I've read your answer thoroughly. I do as you said, but the inner join with MyTableKeys, even if I just load the pks, takes days! That's why I asked about UNION ALL, or reading in parts straight from the db.


Kindly look here as well; it's another post I made about this, where you can see my code so far: https://community.qlik.com/t5/New-to-Qlik-Sense/hard-delete-incremental-load-on-very-large-table/m-p...

ioannagr
Creator III
Author

@Dalton_Ruer oh, wait, you mean create a QVD with only MyTableKeys and inner join it with the above? Was it so simple?! 😱 Is that okay with the transactional data, though? Doesn't this mean it only gets stored once, so a deletion that happens in data loaded after the QVD's creation will not get recognized as a deletion? Why does Qlik Help then propose reading straight from the db?


So many questions. 😅 Thank you in advance for your time @Dalton_Ruer, I so enjoy learning.

Dalton_Ruer
Support

No, you don't store the table that is just the keys. You load it live each time you reload. You only store the merged data:

1. Load the new data live.

2. Concatenate to it from your QVD using WHERE NOT EXISTS.

3. Load the PKs live.

4. Inner join.

5. Store the merged data back to the QVD.
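Put together, those five steps might look roughly like this in load script. The QVD path, the change-tracking filter, and the field names beyond pk are assumptions for illustration:

```
// 1. Load the new data live (rows added or changed since the last run)
MyTable:
LOAD pk, year, amount;
SQL SELECT pk, year, amount
FROM Table2
WHERE last_modified >= '$(vLastReload)';   // assumes a change-tracking column

// 2. Concatenate the historical rows from the QVD;
// WHERE NOT Exists keeps the fresh versions loaded above
Concatenate (MyTable)
LOAD pk, year, amount
FROM [lib://Data/MyTable.qvd] (qvd)
WHERE NOT Exists(pk);

// 3 + 4. Load the primary keys live and inner join inside Qlik;
// any pk deleted in the database drops out of MyTable here
Inner Join (MyTable)
LOAD pk;
SQL SELECT pk FROM Table2;

// 5. Store the merged result back to the QVD for the next run
STORE MyTable INTO [lib://Data/MyTable.qvd] (qvd);
```

Note that steps 3 and 4 collapse into a single Inner Join prefix on a live SELECT: the join itself still runs inside Qlik, while MySQL only has to return the key column.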

ioannagr
Author

Okay, I think we're so close to solving the problem I've had for SO long, please bear with me @Dalton_Ruer.

So you suggest:

1) I load my new data from the source table by year, live, as I have done

2) concatenate with my QVD using WHERE NOT EXISTS

3) load all pks from the source table by year (again, I suppose, else it takes aaaaaaaages) live

4) inner join with what? 🤔

5) store as a final step.