leesider
Contributor

SQL timeout on large query

We recently changed our QlikView reports to a new connection string. One report had been connected to the previous database since 2014. It uses a query that returns almost 15 million rows, and the resulting QVD file is over 2 GB. On the old connection the reload always completed, but since we moved to the new one, reloading the QVW file runs for a couple of hours and then times out with the error GEN_LED_VOUCHER_ROW_UNION_QRY << GEN_LED_VOUCHER_ROW_UNION_QRY Error: QVX_UNEXPECTED_END_OF_DATA: SQL##f

The QVD file does not contain all the rows, so the report is missing some data. What I don't understand is why it never timed out on the previous database but is timing out on the new one. The QVW file connects to the database using an "include" file.

$(Include=..\environment settings\database connection string.txt);

Can I add a timeout to the connection string in that text file to make the default timeout longer? I am not even sure the query will ever finish, because when I run a select * against it in PL/SQL Developer it locks up my PC and never returns the rows.
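
From what I can tell, the standard OLE DB keyword is Connect Timeout, but as far as I know it only limits how long the connection attempt itself may take, not how long a running query can run, and whether our provider honours it is another question. The include file is basically just the CONNECT line, something like this (the provider and data source below are placeholders, not our real values):

// Sketch of database connection string.txt; provider/data source are placeholders
OLEDB CONNECT TO [Provider=OraOLEDB.Oracle;Data Source=MYDB;Connect Timeout=120];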

As a workaround I copied the QVD file produced from the old database into the folder used by the new one. The data that is missing from the new database (because the query is timing out) is then present in the QlikView file that reads from the new database. The problem is that this QVD only contains data from the old database and none from the new one. If I execute the reload again, will it append the data from the new database to the QVD file?
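
My guess is that STORE rewrites the whole QVD rather than appending, so I would probably need something like this (only a sketch; it assumes the copied QVD and the new query return exactly the same fields, and VOUCHER_DATE plus the date literal are placeholders for whatever date column and cut-off the view really has):

// Sketch: append new rows from the new database onto the copied QVD
GEN_LED_VOUCHER_ROW_UNION_QRY:
LOAD * FROM [..\QVD\GEN_LED_VOUCHER_ROW_UNION_QRY.qvd] (qvd);    // rows already in the old QVD

Concatenate (GEN_LED_VOUCHER_ROW_UNION_QRY)
SQL SELECT * FROM IFSAPP."GEN_LED_VOUCHER_ROW_UNION_QRY"
WHERE VOUCHER_DATE >= DATE '2020-01-01';                         // only the rows the old QVD is missing (placeholder column/date)

STORE GEN_LED_VOUCHER_ROW_UNION_QRY INTO [..\QVD\GEN_LED_VOUCHER_ROW_UNION_QRY.qvd] (qvd);
DROP TABLE GEN_LED_VOUCHER_ROW_UNION_QRY;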

As you can see I have a few questions; it's a bit of a mess.

4 Replies
Marcos_Ferreira_dos_Santos

Hi Leesider,

What do you think about breaking your QVD into more than one file (maybe by year, or another split), and loading them as xxxx*.qvd in your QVW?
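
Something like the loop below could generate the yearly files in one pass (just a sketch; MYSCHEMA.BIG_TABLE and TRANS_DATE are placeholders for your real source and date column, and EXTRACT assumes an Oracle-style database):

// Sketch: one QVD per year; table/column names are placeholders
FOR vYear = 2014 TO 2020
    YEAR_$(vYear):
    SQL SELECT * FROM MYSCHEMA.BIG_TABLE
    WHERE EXTRACT(YEAR FROM TRANS_DATE) = $(vYear);

    STORE YEAR_$(vYear) INTO [..\QVD\xxxx$(vYear).qvd] (qvd);
    DROP TABLE YEAR_$(vYear);
NEXT vYear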

leesider
Contributor
Author

That might work if I split that one big query into 6 parts, e.g. every year from 2014 to 2020.

But how do I get the QVW to load all six at the same time?

The QVW has this in the script:

LOAD
    // field names
    // ...
FROM
    [path to qvd file];

How do I get it to load from six different QVDs that all have the same fields? If I do a join on all six it will slow it down.

Marcos_Ferreira_dos_Santos

Hi Leesider,

If you name them with a certain pattern (e.g. FILE2014.qvd, FILE2015.qvd, etc.) you can reference them in the FROM clause as "<path>\FILE2*.qvd". It should work; the QlikView engine concatenates tables with identical fields automatically.
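
In your existing script that would just mean changing the FROM line and keeping whatever field list you already have, e.g. (a sketch, assuming the yearly files are stored as GEN_LED_VOUCHER_ROW_UNION_QRY_<year>.qvd in the ..\QVD folder):

GEN_LED_VOUCHER:
LOAD *                                                       // or the explicit field list already in the script
FROM [..\QVD\GEN_LED_VOUCHER_ROW_UNION_QRY_2*.qvd] (qvd);    // matches _2014.qvd .. _2020.qvd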

leesider
Contributor
Author

The query that is returning 15 million rows is called GEN_LED_VOUCHER_ROW_UNION_QRY.

It collects data from 2014 to 2020, so my idea was to make seven copies of this query, one for each year from 2014 to 2020, and edit each copy to pull only the rows for its year, e.g. GEN_LED_VOUCHER_ROW_UNION_QRY_2014, GEN_LED_VOUCHER_ROW_UNION_QRY_2015 and so on up to GEN_LED_VOUCHER_ROW_UNION_QRY_2019.

I created the queries and changed the QlikView script to what is below. I just tried loading GEN_LED_VOUCHER_ROW_UNION_QRY_2019 and GEN_LED_VOUCHER_ROW_UNION_QRY_2020 rather than loading them all, but I got the error "GEN_LED_VOUCHER_ROW_UNION_QRY_2019 table not found".

 

SQL SELECT *
FROM IFSAPP."GEN_LED_VOUCHER_ROW_UNION_QRY_2019";

STORE GEN_LED_VOUCHER_ROW_UNION_QRY_2019 into [..\QVD\GEN_LED_VOUCHER_ROW_UNION_QRY.qvd];

DROP TABLE GEN_LED_VOUCHER_ROW_UNION_QRY_2019;


SQL SELECT *
FROM IFSAPP."GEN_LED_VOUCHER_ROW_UNION_QRY_2020";

STORE GEN_LED_VOUCHER_ROW_UNION_QRY_2020 into [..\QVD\GEN_LED_VOUCHER_ROW_UNION_QRY.qvd];

DROP TABLE GEN_LED_VOUCHER_ROW_UNION_QRY_2020;

EXIT SCRIPT;

This is what the script was before; it did not give errors, but as I described above it timed out because the GEN_LED_VOUCHER_ROW_UNION_QRY query returns 15 million rows.

 

SQL SELECT *
FROM IFSAPP."GEN_LED_VOUCHER_ROW_UNION_QRY";

STORE GEN_LED_VOUCHER_ROW_UNION_QRY into [..\QVD\GEN_LED_VOUCHER_ROW_UNION_QRY.qvd];

DROP TABLE GEN_LED_VOUCHER_ROW_UNION_QRY;
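
And for reference, the per-year shape I am aiming for is one labelled table and one QVD per year, roughly like the sketch below. It assumes the _2019 and _2020 views really exist in the IFSAPP schema (which the "table not found" error makes me doubt), and it stores each year in its own file rather than the single file name used above, which the second STORE would otherwise overwrite.

// Sketch: explicit table label and one QVD per year, assuming the yearly views exist in IFSAPP
GLV_2019:
LOAD *;
SQL SELECT * FROM IFSAPP."GEN_LED_VOUCHER_ROW_UNION_QRY_2019";

STORE GLV_2019 INTO [..\QVD\GEN_LED_VOUCHER_ROW_UNION_QRY_2019.qvd] (qvd);
DROP TABLE GLV_2019;

GLV_2020:
LOAD *;
SQL SELECT * FROM IFSAPP."GEN_LED_VOUCHER_ROW_UNION_QRY_2020";

STORE GLV_2020 INTO [..\QVD\GEN_LED_VOUCHER_ROW_UNION_QRY_2020.qvd] (qvd);
DROP TABLE GLV_2020;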