Not applicable

Loading records from MySQL is not working with limit greater than 9000

Hi,

I have some tables in a MySQL DB with more than 7 million records. When I try to load records with no limit (SELECT * FROM Table;), the execution does not even start (it just says connected to the DB and then freezes, even after an hour). If I set a limit of 9000, it works perfectly fine. I cannot figure out where the problem is. I asked the DB team to check the indexes on the table, and they said everything is fine on their side.

Could someone please help me with this issue?

Thanks.

6 Replies
marcus_sommer

I think there aren't enough resources on your side or the database side, and/or there might also be issues with the driver. Besides that, you could try to load the data within a loop over a certain criterion - maybe year, company or similar - and store the chunks as QVD files; afterwards you could apply an incremental approach.
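
A rough sketch of that idea, assuming the table has a numeric YEAR column to split on (the column name, the year range and the QVD names are just placeholders):

// Sketch only: load one year at a time and store each chunk as a QVD.
// YEAR is an assumed column - substitute whatever criterion fits your data.
FOR vYear = 2010 TO 2016

    Chunk:
    SQL SELECT *
    FROM HR.EMPLOYEES
    WHERE YEAR = $(vYear);

    STORE Chunk INTO [Employees_$(vYear).qvd] (qvd);
    DROP TABLE Chunk;

NEXT vYear

Once the QVDs exist, later reloads can read them directly and only hit MySQL for new or changed rows (the incremental part).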

- Marcus

Not applicable
Author

Thank you for your reply, Marcus. I tried to load the records within a loop. The loop script I used is similar to the one below.

Count:
SQL SELECT count(*) as Tot_Rows,
     max(EMPLOYEE_ID) as Max_ID
FROM HR.EMPLOYEES;

LET vMax = Peek('TOT_ROWS',0,'Count');
LET vExit_Loop = Peek('MAX_ID',0,'Count');

Drop Table Count;

FOR i = 0 to $(vMax)

HR:
SQL SELECT *
FROM HR.EMPLOYEES WHERE EMPLOYEE_ID = 100 + ($(i)*2);

Concatenate
SQL SELECT *
FROM HR.EMPLOYEES WHERE EMPLOYEE_ID = 100 + ($(i)*2+1);

EXIT FOR WHEN ($(i)*2) = $(vExit_Loop)

NEXT i;

But this script takes a very long time to execute, as it loads only a single record on each pass.

Could you please suggest some better ideas for implementing this load script?

Also, my exit statement is not working:

EXIT FOR WHEN ($(i)*2) = $(vExit_Loop)

Could you please tell me what is wrong with it?

jonathandienst
Partner - Champion III

I don't think Marcus was proposing that you load the data one record at a time - as you noticed, that is extremely slow. Use some grouping, like Month. You have not provided any details of your data, so we cannot make detailed suggestions, but analyse your data looking for a grouping that breaks it into smaller chunks (Month, Department, Region, EmployeeType, ...), along the lines of the sketch below.
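
A rough sketch, assuming a numeric DEPARTMENT_ID column is a workable grouping key (the column name is just an assumption; any field that splits the data into manageable chunks will do):

// Sketch only: fetch the list of groups first, then load one group per pass.
// Assumes DEPARTMENT_ID is numeric; a string key would need quoting in the WHERE clause.
Groups:
SQL SELECT DISTINCT DEPARTMENT_ID
FROM HR.EMPLOYEES;

FOR g = 0 TO NoOfRows('Groups') - 1

    LET vGroup = Peek('DEPARTMENT_ID', $(g), 'Groups');

    Employees:
    SQL SELECT *
    FROM HR.EMPLOYEES
    WHERE DEPARTMENT_ID = $(vGroup);

NEXT g

DROP TABLE Groups;

Each pass pulls one chunk, and the loads auto-concatenate into Employees because the field lists are identical.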

Logic will get you from a to b. Imagination will take you everywhere. - A Einstein
jonathandienst
Partner - Champion III

SQL SELECT count(*) as Tot_Rows,
     max(EMPLOYEE_ID) as Max_ID
FROM HR.EMPLOYEES;

LET vMax = Peek('TOT_ROWS',0,'Count');
LET vExit_Loop = Peek('MAX_ID',0,'Count');

Note that QV field names and variables are case sensitive, so vMax and vExit_Loop will both be null - which is also why your EXIT FOR condition never fires.
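
In other words, the Peek() calls have to match the case in which the field names actually come back, which here is presumably the aliased case from the SELECT:

// The field names in Peek() must match their actual case,
// assumed here to be exactly as aliased in the SELECT (Tot_Rows / Max_ID).
LET vMax = Peek('Tot_Rows', 0, 'Count');
LET vExit_Loop = Peek('Max_ID', 0, 'Count');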

Logic will get you from a to b. Imagination will take you everywhere. - A Einstein
Not applicable
Author

Hi Jonathan,

Yes, I know Marcus didn't propose loading one record at a time; that was just the approach I came up with on my own. I am sorry for not providing the details, but I am no longer working on that project. I hope you understand.

Loading the data by grouping is a good idea, though. I will try it with my samples.

Thanks,

Ashok.

Not applicable
Author

Thanks, Jonathan. I see the mistake now.