
Inserting a large number of rows with Dynamic Update

Hello everybody,

My situation is as follows:

I have one main table with about 10-20M rows, and I need to assign these rows to (possibly overlapping) groups. To achieve this, I created a mapping table to store combinations of row keys from the main table and the group assignments. The table layout looks something like this:

Main:
LOAD RowNo() as Key,
     'foo' as value
AutoGenerate 100;

Mapping:
LOAD 0 as Key,
     0 as Group
AutoGenerate 0;

What I want to do is to allow the user to select any number of rows from the Main table and to map those to a group based on a variable value.

In SQL, I could do that by issuing a statement like this (assuming the table MainPossible contained only the values from Main in the current selection):

INSERT INTO MAPPING (Key, Group) (SELECT Key, "$(Group)" as Group from MainPossible);

I tried several variations to get this running as a Dynamic Update action, but either QlikView doesn't support the nested SELECT query or I was unable to find the right syntax.

My second attempt was to embed the values to be inserted in the SQL statement itself, using the following statement for the Dynamic Update action:

='INSERT INTO Mapping (Key, Group) values (' & Concat(Key,',$(Group)), (') & ',$(Group))'
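For illustration, with keys 1, 2, and 3 selected and $(Group) expanding to 7, the expression above evaluates to a statement of this shape:

INSERT INTO Mapping (Key, Group) values (1,7), (2,7), (3,7)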

This works fine for smaller selections (e.g. 4000 rows). However, I need the insert to handle at least 10-20M rows, and that is not possible with this approach in a reasonable time (I cancelled my attempt after several minutes of waiting).

Any ideas on how to solve this?


3 Replies
alexmosq
Partner - Contributor III

Any luck yet?

Not applicable
Author

I did not find a way to make the nested INSERT statement work with Dynamic Update, but I was able to put together a reasonably fast solution that uses two different insertion methods depending on the number of rows to be inserted.

For lower numbers of rows (I picked 5000 as the limit), I used a Dynamic Update action with concatenated INSERT statements as described in my first post.

For larger selections, I export the selected Key values to a CSV file and append them to the Mapping table via a partial reload. This approach takes less than 1½ minutes for 20M rows, which is fast enough for my use case.
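As a rough sketch of the partial-reload path (the file name and the vGroup variable are assumptions, not from the original post), the script side can look like this:

Mapping:
// ADD appends the loaded rows to the existing Mapping table during a
// partial reload; ONLY makes the statement a no-op during a full reload.
ADD ONLY LOAD
    Key,
    $(vGroup) as Group                   // current group value from the variable
FROM [C:\temp\SelectedKeys.csv]          // hypothetical path written by the export step
(txt, utf8, embedded labels, delimiter is ',');

The CSV itself has to be written first, e.g. with a macro that calls ExportEx on a table box containing only the Key field, before the macro triggers the partial reload.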

alexmosq
Partner - Contributor III

Good idea! That sounds like a reasonable approach.

Thanks