Hi all
I have a QlikView model which I created; it is 500 MB. The model consists of 3 tables: Table 1 = 500k rows (13 fields),
Table 2 = 82 million rows (14 fields), Table 3 = 10 million rows (5 fields).
All of the tables are loaded from QVDs stored on the client computer.
The tables take 40 seconds to load. The model has 2 tabs. The first tab has 5 graphs with 4 filters available.
Now, when I try to create table graphs on the second tab, QlikView crashes every time.
I have more than enough RAM (64 GB); with QlikView open there is on average only 6 GB in use and 10% CPU usage.
The model has no synthetic keys.
I am on QlikView 12 and I'm opening the file on the server.
What would I need to check or change to prevent QlikView from crashing every time I try working on this report?
The expression I am creating is a simple SUM(Sales).
Please find attached the file I am using. I have uploaded it without the data.
On tab 2 there is already a chart I created which does a calculation using data from the two tables, SUM(Sales)/SUM(Leads). I was able to create that first chart; it is only when I try to create the second graph that the application crashes.
Regards
Will look into it.
I don't see any data model or data in your attached file. Can you re-upload it?
If I load the data the file will be 1 GB, and the data is sensitive, so I cannot share it. Is there a way around this?
Just a quick question here - can you try and reproduce the problem with a more recent version of QlikView? You seem to be using a relatively old version (11.2 SR15) whereas the most recent version of 11.2 is SR17 and the most recent version of QlikView is 12.2 SR6. I doubt it'd help, but it might be a good idea to try the most recent version just to make sure the problem isn't caused by a bug that's been resolved.
As for testing the issue: try reducing the dataset (only loading the top N rows, or making selections and then reducing data to possible values). That should help you debug.
I could imagine that the key between your tables isn't really suitable. You said the key represents the customer, but if you want, for example, to use any date/product dimensions or something similar within your objects, the customer key won't connect the dimensions and their granularity properly. Besides wrong results, it might lead to the creation of large cartesian tables - behind each calculation is a temporary table on which the calculation is performed - which might cause your problem.
To get further you could reduce the number of records (maybe with the debugger's limited load, or with some FIRST prefixes on the loads) and/or apply some selections within your application before creating/changing your objects. Both measures will reduce the dataset used for all the background calculations, should minimize the risk of a crash, and will let you check whether your data and logic return the expected results.
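A minimal load-script sketch of the FIRST-prefix idea, assuming QVD sources like the ones described above (the table names, row cap, and QVD paths here are placeholders, not taken from the posted model):

// Debug reload: cap each source at the first 100k rows
// so the model stays small while testing the charts.
First 100000
Sales:
LOAD *
FROM Sales.qvd (qvd);

First 100000
Leads:
LOAD *
FROM Leads.qvd (qvd);

Alternatively, the script debugger's limited-load option caps every LOAD statement at a given number of rows without editing the script at all.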
- Marcus