Hi All,
I am running a SQL query in a QVW to fetch data from the database and using
STORE Table INTO tablename.qvd;
The problem is that it is not able to create the QVD.
Time elapsed is 55 minutes and only 276,736 records fetched so far.
Every time I end up killing it via Task Manager and trying the load again.
Please provide a useful solution.
Thanks
Anant
Have you tested the SQL query in another application (e.g. SQL Server Management Studio or Toad)? How fast does it run there? Unless you are doing some complex calculations, at <300k rows per hour the SQL query is very likely the limiting step, so your optimisation efforts should go there.
Hi Anant,
Can you share your data schema, please?
Let's have a look into it.
Many Thanks
Karthik
Hi Anant,
Share your data and application and we can help more easily.
Or else you can try this way.
To create the QVD, use the statement below:
STORE Table_Name INTO [\Documents\QVD\Name_the_QVD.qvd] (qvd);
Thanks
Nandu.
Please check or share your log file from the QVD generator
OR
Try something like the approach below:

// Track start times for the whole reload and for this table
Let vStart = now();
Let vReloadStart = now();

// Source system and target paths
SET vSAP = 'KTX_100';
Directory D:\qvprod\qvd\SAP\$(vSAP)\;
SET vQVD = 'D:\qvprod\qvd\SAP\$(vSAP)';

// Pull the connection string from an include file
$(Include=D:\qvdev\Includes\connections\$(vSAP).txt);

// Table to extract
SET vTab = 'USER_ADDR';

$(vTab):
SELECT * FROM $(vTab);

// Record how long the load took, then reset the timer
LET vUSER_ADDR_LOAD_DURATION = Interval(now() - vStart);
LET vStart = now();

// Write the table to a QVD (lands in the Directory set above)
STORE $(vTab) INTO $(vTab).qvd (qvd);

// Log the size and record count of the new QVD
LET vUSER_ADDR_Size = num(filesize('$(vQVD)\USER_ADDR.qvd')/1024, '00.0') & ' KB';
LET vUSER_ADDR_Records = QvdNoOfRecords('$(vQVD)\USER_ADDR.qvd');

// Drop the table to free memory
DROP TABLE $(vTab);
Thanks,
AS
I think you need to implement an incremental load approach, starting with smaller data slices from your database (with a WHERE clause on periods, categories or similar) and after that loading only new or changed data; a minimal sketch of the pattern follows below. Here you will find some postings about incremental loading and also how to load data from QVDs in an optimized way: Advanced topics for creating a qlik datamodel
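As a rough sketch only (the names Orders, Orders.qvd, OrderID and LastModified are assumptions for illustration, not from this thread):

// On the very first run, do a full load and STORE once to create Orders.qvd.
// vLastExecTime would normally be persisted between runs (e.g. in a small QVD).
LET vLastExecTime = '1900-01-01 00:00:00';

// 1. Fetch only rows added or changed since the last run
Orders:
SQL SELECT * FROM Orders WHERE LastModified >= '$(vLastExecTime)';

// 2. Append the untouched history from the existing QVD.
//    WHERE NOT Exists() on a single field keeps this an optimized QVD load.
Concatenate (Orders)
LOAD * FROM Orders.qvd (qvd) WHERE NOT Exists(OrderID);

// 3. Overwrite the QVD with the merged result
STORE Orders INTO Orders.qvd (qvd);

// 4. Remember this run's timestamp for the next reload
LET vLastExecTime = Timestamp(ReloadTime(), 'YYYY-MM-DD hh:mm:ss');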
- Marcus
Anant Dubey,
Is the location of the DB remote? Say, for example, the database is on a server in the US and you are running the load from a server in another country or location; in that case the SQL querying will obviously take time.
It is better to use remote desktop on a machine near the database and run the SQL there to make the QVD.
If the DB and the machine you are running from are in the same location and it still takes this long, then you need to look at the query itself.
Check whether you are using GROUP BY or any other costly operation in the script.
Regards,
Siva
Thanks Jonathan!! Yes, you are right, the query contains very large, complex calculations.
Although the final QVD has only 4,23,000 records, it took 2 hours to create the QVD the first time.
The database people confirmed this query has heavy calculations, so it was taking a long
time to run and fetch the records.
Thanks
Anant
Yes Siva, there were large, complex calculations with a lot of joins and GROUP BYs.
Thanks
Anant
You might want to look at reducing the calculations in the server query and bringing some of them into the QVW for a speed increase; a rough sketch of the idea follows below.
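For illustration only (the names Sales, RawSales, Region, SalesDate and Amount are assumptions, not from the original application), pulling raw rows and aggregating in the load script could look something like this:

// Keep the server query simple: no joins, calculations or GROUP BY
RawSales:
SQL SELECT Region, SalesDate, Amount FROM Sales;

// Do the aggregation locally in QlikView, where it is usually faster
SalesSummary:
LOAD
    Region,
    Year(SalesDate) AS SalesYear,
    Sum(Amount)     AS TotalAmount
RESIDENT RawSales
GROUP BY Region, Year(SalesDate);

DROP TABLE RawSales;

STORE SalesSummary INTO SalesSummary.qvd (qvd);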