I am new to Qlik and have a question about a table I am trying to parse. Our production database has a key/value table with the following structure:
Id,
Category,
DateEntered,
Name,
Value
I need to extract about 20 of the key/value pairs (based on Name) so that I can join them to the rest of the data set on their Id. There are about 80 million rows in this key/value table, and I need the transformation to happen very efficiently. I am already taking the table as is (no transformations) and saving it as a QVD, and I am using an incremental load to keep this table up to date. I intend to do this transformation incrementally as well.
My question is: what is the most efficient way to parse this data set? My first attempt pulls the Ids from another table (where they are the primary key) and then joins the key/value table one key at a time. However, this isn't very fast. Is there a more efficient way to do this?
Here is a sample of my script:
KeyValueParse:
LOAD
    Id
FROM Table;

JOIN (KeyValueParse)
LOAD
    Id,
    Value AS IPAddress
FROM Table2
WHERE Name = 'IPAddress';

JOIN (KeyValueParse)
LOAD
    Id,
    Value AS FirstName
FROM Table2
WHERE Name = 'FirstName';
...
The output table should look like:
Id,
Value AS IPAddress,
Value AS FirstName,
...
I was thinking perhaps I could pivot the data in the script, but the searching I have done says that isn't an option.
You can use a Generic Load or a For loop to do this.
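A hedged sketch of the Generic Load approach, assuming the key/value data is stored in a QVD called KeyValue.qvd and the table names below (Pairs, Wide) are placeholders, not from the original post. GENERIC LOAD creates one two-column table per distinct Name, which a loop can then join back into a single wide table keyed on Id:

```
// Sketch only: pivot the key/value pairs with GENERIC LOAD.
// Restrict to the ~20 names you need so the other rows never enter the pivot.
Pairs:
GENERIC LOAD
    Id,
    Name,
    Value
FROM KeyValue.qvd (qvd)
WHERE Match(Name, 'IPAddress', 'FirstName');  // ...plus the other names

// GENERIC LOAD leaves one table per Name (Pairs.IPAddress, Pairs.FirstName, ...).
// Join them back into one wide table on Id:
Wide:
LOAD DISTINCT Id
FROM KeyValue.qvd (qvd);

FOR i = NoOfTables() - 1 TO 0 STEP -1
    LET vTable = TableName(i);
    IF SubField('$(vTable)', '.', 1) = 'Pairs' THEN
        LEFT JOIN (Wide) LOAD * RESIDENT [$(vTable)];
        DROP TABLE [$(vTable)];
    END IF
NEXT i
```

Because the WHERE clause filters before the pivot, this reads the QVD once instead of once per key, which should be considerably faster than 20 separate joins.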
Perfect. Thank you.
Are there best practices for an incremental Generic Load? The generic table I am loading from has a date stamp and about 80 million rows. I am hoping to make the load as fast as possible.
I have never done that before, but I think it should be the same as any other incremental load: grab the new data and use a Generic Load to assign it to its buckets.
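A minimal sketch of that incremental variant, assuming the last reload time is kept in a variable vLastExecTime and the wide result is stored as Wide.qvd (both names are assumptions, not from the thread):

```
// Sketch only: generic-load just the rows added since the last run,
// then merge the delta into the previously stored wide QVD.
NewPairs:
GENERIC LOAD
    Id,
    Name,
    Value
FROM KeyValue.qvd (qvd)
WHERE DateEntered > '$(vLastExecTime)'
  AND Match(Name, 'IPAddress', 'FirstName');  // ...plus the other names

// Join the per-Name tables produced by GENERIC LOAD into one delta table:
NewWide:
LOAD DISTINCT Id
FROM KeyValue.qvd (qvd)
WHERE DateEntered > '$(vLastExecTime)';

FOR i = NoOfTables() - 1 TO 0 STEP -1
    LET vTable = TableName(i);
    IF SubField('$(vTable)', '.', 1) = 'NewPairs' THEN
        LEFT JOIN (NewWide) LOAD * RESIDENT [$(vTable)];
        DROP TABLE [$(vTable)];
    END IF
NEXT i

// Keep old rows only for Ids that received no new data, then store back.
CONCATENATE (NewWide)
LOAD * FROM Wide.qvd (qvd)
WHERE NOT Exists(Id);

STORE NewWide INTO Wide.qvd (qvd);
```

The Exists(Id) filter is the standard insert-and-update incremental pattern, so an Id that gets a fresh key/value row replaces its old wide row rather than duplicating it.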