Hello QlikView addicts,
I have a relatively large qvw (4 GB uncompressed) with a fact table of 120.000.000 records.
The reload of the qvw takes around 40 minutes.
To connect the dimension tables to the fact table I use composite keys that are quite large. See the example below.
'$(vStoreRowPrefix)' & '-' & [Store No_] & '-' & [POS Terminal No_] & '-' & [Transaction No_]
It was time to make some modifications to the qvw, so I thought I would also add AutoNumber to the dashboard:
Autonumber('$(vStoreRowPrefix)' & '-' & [Store No_] & '-' & [POS Terminal No_] & '-' & [Transaction No_], 'header')
The size of the qvw became much smaller, but the reload time increased greatly!
The reload now takes about 2 hours.
Is this normal behaviour, or am I doing something wrong?
Thanks in advance for all the help
Bill,
Can I send the logfiles to your mail address?
Hi Fabrice,
I created a test with 3 different types of reload.
The load was based on 1 store:
regular reload
reload with autonumber
reload with autonumberhash256
and these are my results:
regular reload                 Duration: 04:30   Size: 961.423
reload with autonumber         Duration: 16:19   Size: 919.812
reload with autonumberhash256  Duration: 11:54   Size: 711.161
So yes, the reload with autonumberhash256 is faster than with autonumber, but still a lot slower than the regular reload.
On the other hand, the size of the qvw decreased a lot.
Jeroen,
Thanks for this information.
AutoNumberHash256 is therefore quicker than the simple AutoNumber (by roughly 30%, that's a lot), and the qvw is also smaller (that is a surprise to me).
Fabrice
Hi,
Does somebody have an idea how I can solve this issue?
Thanks in advance
Jeroen,
One other idea: create your own numbers by first loading the concatenated key, or the different parts of the keys. I would say that I don't believe it will "save" you, but I have seen in the past that some "out-of-the-path" ideas turned out to be the good ones.
1) ApplyMap
MAPPING LOAD DISTINCT YourConcatenatedKey, ANumber
into a 2-field table, so that you can use ApplyMap() later.
But the AutoNumber() functions are already a sort of ApplyMap().
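A minimal sketch of that ApplyMap idea in load-script form, assuming the fact table is called Fact and reusing the key parts from the opening post (table and field names are illustrative):

```qlik
// 1) Build a key → number mapping from the distinct keys
KeyMap:
MAPPING LOAD TransKey, RowNo() AS KeyNo;   // RowNo() generates the sequential number
LOAD DISTINCT
    [Store No_] & '-' & [POS Terminal No_] & '-' & [Transaction No_] AS TransKey
RESIDENT Fact;

// 2) Replace the wide text key by its number via ApplyMap()
Fact2:
NOCONCATENATE LOAD *,
    ApplyMap('KeyMap', [Store No_] & '-' & [POS Terminal No_] & '-' & [Transaction No_]) AS %TransKeyNo
RESIDENT Fact;
DROP TABLE Fact;
```

As noted above, AutoNumber() already behaves like an internal ApplyMap(), so this only helps if building the mapping once is cheaper than AutoNumber's per-row lookup.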
2) Join
LOAD DISTINCT DifferentPartsOfTheKeys, ANumber
Perform a JOIN later (but with your number of rows, it will be very CPU- and memory-intensive).
Drop the unused fields to keep only the number in your table.
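And a sketch of the join variant (same assumed names; as said, expect heavy CPU and memory use at 120.000.000 rows):

```qlik
// Number the distinct key combinations once
Keys:
LOAD *, RowNo() AS %TransKeyNo;
LOAD DISTINCT
    [Store No_], [POS Terminal No_], [Transaction No_]
RESIDENT Fact;

// Attach the number to every fact row via the shared key parts
LEFT JOIN (Fact)
LOAD * RESIDENT Keys;
DROP TABLE Keys;

// Optionally drop the original key parts afterwards, keeping only the number:
// DROP FIELDS [Store No_], [POS Terminal No_], [Transaction No_] FROM Fact;
```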
Fabrice
Hi Jeroen,
Could you post the rows (or even the load statements for all three) for autonumber, as per the test above?
There is no need to add the '-' within the autonumber, for example, as the separators are static.
Hashing is in general a faster operation than the concatenation that autonumber is forced to do.
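For illustration, the two variants side by side (one caveat: dropping the separators is only collision-safe if the key parts have fixed widths, otherwise e.g. 1 & 23 and 12 & 3 would produce the same key string):

```qlik
// With static separators (as in the opening post)
AutoNumber('$(vStoreRowPrefix)' & '-' & [Store No_] & '-' & [POS Terminal No_] & '-' & [Transaction No_], 'header') AS %Key

// Without separators: shorter strings to build per row
AutoNumber('$(vStoreRowPrefix)' & [Store No_] & [POS Terminal No_] & [Transaction No_], 'header') AS %Key
```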
I am surprised by the dramatic difference in QVW file size.
Can you validate that the generated autonumber fields, with and without hash, contain equal numbers of unique values, and that all fields exist in both qvw's?
Regards,
Hampus von Post
Hi Hampus,
Thank you for your response.
At the moment I'm traveling for work.
When I'm back in the Netherlands I will respond to your questions.
Hi Jeroen,
have you tested using autonumber() without the label? What result did you get?
We have a similar problem here:
load without autonumber(): 5 minutes
load with autonumber(): 1h 50 min
for 55.746.622 lines
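For reference, the two variants in question (field names are placeholders): the optional second argument gives autonumber() its own private counter series, while without it all autonumber() calls in the script share one series.

```qlik
autonumber(Field1 & '-' & Field2) AS Key            // shared default counter series
autonumber(Field1 & '-' & Field2, 'MyKey') AS Key   // private counter series 'MyKey'
```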
Anyone else having such issues?
Tamara
I have the same problem. QV 11.2 SR15
~40 000 000 unique records
load autonumber(Field1&Field2) as Key - 37 minutes
load Field1&Field2 as Key - 3 minutes
Is the problem solved?
Hi Igor,
are you using a label in the autonumber function to create a private range?
I think it might be a hardware issue. Which CPU are you using?
Regards
Tamara