datanibbler
Champion

Detail question on STORE

Hi,

I would like to know: In the apps here, there is quite often a STORE in the form of

>> STORE * FROM ... INTO ... <<

This has obviously been working fine; I just never used the FROM in this statement - I always just used

>> STORE ... INTO ... <<

My question is: does the FROM clause have any effect? Does it change anything about what the statement does, or about its speed?
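For reference, the two forms look like this (table, field, and file names here are invented for illustration). When all fields are stored, the two are equivalent; the FROM variant additionally lets you name a subset of fields:

```
// Both of these store every field of MyTable into the QVD:
STORE MyTable INTO [..\QVD\MyTable.qvd] (qvd);
STORE * FROM MyTable INTO [..\QVD\MyTable.qvd] (qvd);

// The FROM clause matters when you only want some of the fields:
STORE Field1, Field2 FROM MyTable INTO [..\QVD\MySubset.qvd] (qvd);
```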

Thanks a lot!

Best regards,

DataNibbler

12 Replies
datanibbler
Champion
Author

Hi Marcus,

I hope I understand your point correctly - I think that is all right: the original data from SAP is loaded "outside" all the individual reports anyway, so it is still there. The transformation.qvw where I want to remove the unnecessary fields (and where I then have to add data from non-SAP subsidiaries from Excel) is specific to this one report. I am working backwards from the datamodel.qvw, which is quite a lot of work as there are many fields that are not needed - did I mention at some point that I dislike >> LOAD * << 😉 ? Still, it is possible that later transformation steps specific to this report refer to something from this one, so I will check that. Since this is the first transformation to run in this context, I will have to search all the others, too.

Best regards,

DataNibbler

marcus_sommer

Hi DataNibbler,

in addition to my statement above, it might be helpful to use more granular layers within the environment. My OrderTables, for example, are already the second layer, grabbing the transactional data from the first layer (only a few transformations happen in both - they mostly distribute the data). The next layer extracts various mappings and aggregations from them, which are then matched within the fourth layer. All the output qvd's from these layers fill the fifth, datamodel layer, which contains only a few transformations (a few renames, autonumber and so on) and just matches the different qvd's into the datamodels from which the reports load binary.

Nearly everything within layers 1 - 4 loads the data incrementally and slices the qvd's on a monthly level. This is of course more conceptual and a bit more expensive in development than a 2/3-tier architecture (which, if it includes any incremental methods at all, mostly has them on the first layer), but afterwards I'm quite flexible to distribute and parallelize the various parts.
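A minimal sketch of the incremental pattern described above, assuming a modification timestamp in the source and a per-month QVD (all table, field, and file names are invented for illustration):

```
// Load only the records changed since the last successful run ...
Orders:
LOAD OrderID, OrderDate, Amount
FROM [..\Extract\SAP_Orders.qvd] (qvd)
WHERE ModificationTime >= '$(vLastExecTime)';

// ... and add the unchanged history back from the existing monthly QVD.
CONCATENATE (Orders)
LOAD OrderID, OrderDate, Amount
FROM [..\QVD\Orders_2014_05.qvd] (qvd)
WHERE NOT Exists(OrderID);

// Store the refreshed monthly slice back to disk.
STORE Orders INTO [..\QVD\Orders_2014_05.qvd] (qvd);
```

In practice the month in the file name would be derived from a variable or loop rather than hard-coded.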

- Marcus

datanibbler
Champion
Author

Hi,

well, I won't mess with that too much for now. There are quite a few layers to begin with, and data is, as I understand it, loaded from SAP incrementally - although everything is kept in Qlik, which poses a bit of a performance problem on the surface ....

There are the extractors on the 1st layer, drawing data from SAP and in some cases from Excel.

Then come a few global transformations (not report-specific).

Specific to each report, there are several transformation steps for different things, each one using the original qvd's (from the 1st and 2nd layers).

After all the transformation steps, there is a datamodel.qvw for each report, just drawing and combining all the transformed qvd's - the datamodel is finally loaded with a BINARY into the application.
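That final step can be sketched like this (the file name is invented); note that a BINARY statement has to be the very first statement in a load script:

```
// First line of the report's script: inherit the complete data model.
Binary [..\DataModels\Report_datamodel.qvw];

// Report-specific additions (variables, section access, etc.) follow here.
```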

I think some consultant came up with that structure, as it was the same in the last company I worked for. I guess it does have advantages, and we can add more transformations if we need them.