Qlik Community


QlikView App Dev

Discussion Board for collaboration related to QlikView App Development.

datanibbler
Champion

Detail question on STORE

Hi,

I would like to know: In the apps here, there is quite often a STORE in the form of

>> STORE * FROM ... INTO ... <<

This has obviously been working fine; I just never used the FROM in this statement. I always just used

>> STORE ... INTO ... <<

My question is: does the FROM have any effect? Does it change anything about the statement? What about speed?

Thanks a lot!

Best regards,

DataNibbler

1 Solution

Accepted Solutions
MK_QSL
MVP

* or a list of field names is used when you don't want to store the full table into a QVD or flat file.

Say you have the table below:

Data:

Load Country, Year, Sales From ....

If you want to store only Country and Sales, you can use:

Store Country, Sales From Data into Data.qvd(QVD);


The two lines below will give you the same output:

Store * From Data into Data.qvd(QVD);

Store Data into Data.qvd(QVD);

I never tried it, but the two above should give the same performance.
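A consolidated sketch of the three variants in QlikView script (the inline data and file names are illustrative, not from the original post):

```qvs
// Source table from the example above, with some sample rows
Data:
Load * Inline [
Country, Year, Sales
DE, 2023, 100
FR, 2023, 200
];

// 1) Store the whole table (no FROM)
Store Data into Data_full.qvd (qvd);

// 2) Store the whole table with * FROM - same content as 1)
Store * From Data into Data_star.qvd (qvd);

// 3) Store only selected fields - the QVD contains just these columns
Store Country, Sales From Data into Data_subset.qvd (qvd);
```

Variants 1) and 2) produce identical QVDs; only variant 3) changes the stored content.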


12 Replies

keerthika
Creator II

Hi,

    I think that using explicit column names instead of Store * From is faster, because in SQL a SELECT with column names takes less time to execute than SELECT *.

datanibbler
Champion
Champion
Author

Hi Keerthika,

I will try it and report back whether that is the case.

If nothing else, that approach would have the advantage that any fields which are later no longer loaded could be left out of the STORE in the first place.

keerthika
Creator II

Hi,

    Okay, please try it and let me know.

datanibbler
Champion
Champion

Well, I had no clean-room scenario for testing: the STORE went over a network connection to a server path, so there might have been some interference. I'll have to test it again eventually.

I tested with a really big table of about 47 million records, running QlikView on my local laptop and storing to the server.

=> It seems that the variant >> STORE * FROM << is indeed faster than the simple STORE, although I cannot think of a logical explanation, and >> STORE [list of fields] FROM << (with all fields included for now) is somewhere in between.

keerthika
Creator II

Hi,

    Okay. Thanks for the reply! Have a nice day!

marcus_sommer
MVP & Luminary

I just did some small testing of the various store statements and would say that there is no significant difference between them. The differences between the single store statements are mainly caused by measurement inaccuracies: now() doesn't return milliseconds, only full seconds, there is further I/O between the OS and the network/storage, and a store might be queued in some way.

But between network and local storage there is a big difference ...

Here what I did with a table with 2 fields and 10 M records:

t:
load
recno() * 2  as ID,
Date(makedate(2000) + ceil(rand() * 1460), if(mod(recno(), 2)=0, 'DD.MM.YYYY', 'YYYY-MM-DD')) as Date
//floor(makedate(2000) + ceil(rand() * 1460)) as Date
autogenerate 10000000;

let i = rangesum($(i), 1);
let start = num(now() - today(), '#.########', '.', ',');
store t into (qvd);
StoreTimes:
load time($(start)) as Start, time(now() - today()) as End, time(now() - timestamp($(start))) as Result,
'store table into'
as Type, $(i) as Iteration, 'SSD' as Storage, 'overwritten' as QVD autogenerate 1;

let start = num(now() - today(), '#.########', '.', ',');
store * from t into (qvd);
StoreTimes:
load time($(start)) as Start, time(now() - today()) as End, time(now() - timestamp($(start))) as Result,
'store * from table into'
as Type, $(i) as Iteration, 'SSD' as Storage, 'overwritten' as QVD autogenerate 1;

let start = num(now() - today(), '#.########', '.', ',');
store ID, Date from t into (qvd);
StoreTimes:
load time($(start)) as Start, time(now() - today()) as End, time(now() - timestamp($(start))) as Result,
'store fields from table into'
as Type, $(i) as Iteration, 'SSD' as Storage, 'overwritten' as QVD autogenerate 1;

if filetime('StoreTimes.qvd') > 0 then
load * from StoreTimes.qvd (qvd);
end if

store StoreTimes into StoreTimes.qvd (qvd);

drop tables t;

resulting in: [timing results screenshot from the original post]

- Marcus

datanibbler
Champion
Champion
Author

Hi Marcus,

okay, I know that something is interfering here, so my testing is not perfect; I'll just take your results as given. Thanks a lot!

That leads me to some more potential for optimization. I have a multi-tier environment. If, in the "datamodel.qvw", I have a number of fields which are dropped at the end, the first optimization would be not to load them in the first place; a second step would be not to store them in the preceding "transform.qvw", no?
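A sketch of that idea in load-script form (the qvw names are from the post; the table and field names are hypothetical):

```qvs
// In transform.qvw: store only the fields the downstream
// datamodel actually needs, instead of the full table.
Transform:
Load OrderID, Amount, InternalFlag From Source.qvd (qvd);

// Leave InternalFlag out of the STORE so datamodel.qvw never sees it:
Store OrderID, Amount From Transform into Transform.qvd (qvd);

// In datamodel.qvw there is then nothing left to drop afterwards:
// Load * From Transform.qvd (qvd);
```

This uses the field-list form of STORE discussed above to prune fields one tier earlier, so neither the QVD nor the datamodel reload carries the unused columns.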

Best regards,

DataNibbler

marcus_sommer
MVP & Luminary
MVP & Luminary

Hi DataNibbler,

yes, the most time is saved if unneeded data aren't loaded at all instead of being dropped somewhere afterwards. On the other hand, it can also help to load things at the right time and in the right place.

For example, I have split my orderline data into 3 tables: the data from OrderLineFirst is needed in almost all applications, but OrderLineSecond/Third are only needed in a few, less time-critical applications (and they contain fields which I currently don't use). So my important data stays small and fast, and the less important data can run within another time-frame ...
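The split described above could look roughly like this in script form (OrderLineFirst/Second/Third are from the post; the source QVD and field names are assumptions):

```qvs
// Load the full orderline source once ...
OrderLine:
Load OrderLineID, OrderID, Amount, RareFieldA, RareFieldB
From OrderLineSource.qvd (qvd);

// ... then store the frequently needed fields in one small, fast QVD
Store OrderLineID, OrderID, Amount From OrderLine into OrderLineFirst.qvd (qvd);

// ... and move the rarely used fields into separate QVDs,
// keyed by OrderLineID so they can be joined back when needed
Store OrderLineID, RareFieldA From OrderLine into OrderLineSecond.qvd (qvd);
Store OrderLineID, RareFieldB From OrderLine into OrderLineThird.qvd (qvd);
```

Most applications then load only OrderLineFirst.qvd, while the few that need the rare fields join the second/third QVDs back over OrderLineID.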

- Marcus