josephinetedesc
Creator III

Loading a large CSV document - and a lookup file - which method is faster?

Hi All

a. I have made a QVD document which loads a large CSV file of 1,352,205 records.

b. I then load a much smaller table, which is the "lookup table" for the first file. I use JOIN, so the two tables become one large single table.

When I make a selection in QlikView it is quite slow and it shows the "loading tool".




I could instead load the two files separately and just let QlikView create a synthetic key.

My question is: which would give the faster load? I think the join is marginally faster. Are there any advantages either way? Both options are sketched below.
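
For illustration, the two options look roughly like this in script form (the file, table and field names are placeholders, not my actual ones):

// Option 1: join the lookup table into the large table
Pathology:
LOAD * FROM LargeFile.csv (txt, utf8, embedded labels, delimiter is ',');

LEFT JOIN (Pathology)
LOAD * FROM Lookup.csv (txt, utf8, embedded labels, delimiter is ',');

// Option 2 (alternative script): keep the tables separate and let the
// shared field(s) associate them, possibly via a synthetic key
Pathology:
LOAD * FROM LargeFile.csv (txt, utf8, embedded labels, delimiter is ',');

Lookup:
LOAD * FROM Lookup.csv (txt, utf8, embedded labels, delimiter is ',');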

Thank you

Jo

15 Replies
stabben23
Partner - Master

Yes Jo, best practice here is to use TestSet & TestName as a composite key, for example as sketched below.
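
Something like this (TestSet and TestName are from your model; the table names and the extra lookup field are placeholders):

// build the same composite key in both tables and load the original key
// fields only once, so the tables associate on %TestKey alone
Pathology:
LOAD *,
     TestSet & '|' & TestName as %TestKey
FROM LargeFile.csv (txt, utf8, embedded labels, delimiter is ',');

Lookup:
LOAD TestSet & '|' & TestName as %TestKey,
     ReferenceRange                              // placeholder lookup field
FROM Lookup.csv (txt, utf8, embedded labels, delimiter is ',');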

Have you checked that dateReceived is loaded with the same format in both tables?

marcus_sommer

It's not clear whether a join will work properly here, but you could avoid the potential problems of joins by using mapping: Mapping as an Alternative to Joining.
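
A minimal sketch of the mapping approach, with placeholder names throughout:

LookupMap:
MAPPING LOAD
     TestKey,               // key field
     ReferenceRange         // the single value to fetch per key
FROM Lookup.csv (txt, utf8, embedded labels, delimiter is ',');

Pathology:
LOAD *,
     ApplyMap('LookupMap', TestKey, 'n/a') as ReferenceRange
FROM LargeFile.csv (txt, utf8, embedded labels, delimiter is ',');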

But to me it doesn't look necessary to merge these tables; an association between them within the data model should work (even with a synthetic key, which is technically nearly the same as a combined key). So I think the poor performance is caused by your objects and expressions. What do they look like?

- Marcus

josephinetedesc
Creator III
Author

Hi Staffan

This might be the second time I reply; I cannot see my previous reply.

dateReceived is shown as a date with format dd/MM/yyyy; however, in the script creating the calendar shown below it would be a number. I am working on my local computer, and I have not checked what it looks like from the server.

Temp:
LOAD
     Min(dateReceived) as minDate,
     Max(dateReceived) as maxDate
RESIDENT Pathology;

LET varMinDate = Num(Peek('minDate', 0, 'Temp'));
LET varMaxDate = Num(Peek('maxDate', 0, 'Temp'));

DROP TABLE Temp;

TempCalendar:
LOAD
     $(varMinDate) + IterNo() - 1 as Num,
     Date($(varMinDate) + IterNo() - 1) as TempDate
AUTOGENERATE 1 WHILE $(varMinDate) + IterNo() - 1 <= $(varMaxDate);
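
For completeness, the final step of the calendar would typically look like the sketch below (the extra calendar fields are just the usual ones, not necessarily what I have); renaming TempDate back to dateReceived is what makes it associate with the Pathology table:

MasterCalendar:
LOAD
     TempDate as dateReceived,        // same name as in Pathology so they associate
     Year(TempDate) as Year,
     Month(TempDate) as Month,
     Date(MonthStart(TempDate), 'MMM-YYYY') as MonthYear
RESIDENT TempCalendar;
// note: this only matches if dateReceived in Pathology is a whole date (no time part)

DROP TABLE TempCalendar;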

josephinetedesc
Creator III
Author

"The only time that you cannot use Applymap() instead of a join is if you want to fetch more than one single corresponding value (record) from the second table."

It also seems to mean that I will need to create 4 ApplyMap() calls.

So my next step, I think, will be to check this out on the server and see if it is slow for the users.

Thank you all for your help.

Jo

marcus_sommer

You could use concatenated fields within an applymap() and split them afterwards again with subfield(), like this:

map:
MAPPING LOAD
     key,
     F1 & '|' & F2 & '|' & F3
FROM mapSource;

applymap:
LOAD
     *,
     SubField(ApplyMap('map', key, '#NV'), '|', 1) as F1,
     SubField(ApplyMap('map', key, '#NV'), '|', 2) as F2,
     SubField(ApplyMap('map', key, '#NV'), '|', 3) as F3
FROM source;
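
The same idea could also be written with a preceding load, so that applymap() is called only once per record; a sketch of that variation:

applymap:
LOAD
     *,
     SubField(mapped, '|', 1) as F1,
     SubField(mapped, '|', 2) as F2,
     SubField(mapped, '|', 3) as F3;
LOAD
     *,
     // default '#NV|#NV|#NV' so each split field falls back to '#NV'
     ApplyMap('map', key, '#NV|#NV|#NV') as mapped
FROM source;

DROP FIELD mapped;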

- Marcus

stabben23
Partner - Master

"Outside this discussion" Then I learn something New, the concatenated applymap is a great tool for the Developer toolbox, thanks marcus_sommer