Big table - Qlik Desktop crashes PC - CPU usage 100%
I'm a new user of Qlik and it seems to suit my needs perfectly!
But I'm experiencing a really annoying bug that prevents me from using Qlik Desktop: I loaded a big fact table (7 million rows) with 3 small dimensions and 2 fairly big ones (time and geography) from PostgreSQL.
When I tested with a filtered database (loaded with a filter so I had only a few rows, around 20k), it worked well and I created the app I wanted.
When I tried to do the same with the complete database (in a new app), it loads, but shortly afterwards, when I edit sheets or anything else in the app, Qlik starts using 100% of my CPU and RAM and freezes the PC so badly that I can't even move my mouse or kill the software: I have to reboot my PC.
It seems this happens to some other users too (see that post), but it's an old post. Is there any solution now? Frankly, with this bug, Qlik is useless for me, as I plan to work on several databases of this type…
I have only 8 GB of RAM, and I know that's the minimum, but my crash doesn't come from functions like If or Aggr, as it happens even with empty sheets. For example, it happens if, by mistake, I click on "Selections" or "Insights" (see attached pictures; Qlik is in French, so I don't know if the labels are the same in the English version).
When it comes to performance, QlikView and Qlik Sense behave quite similarly, even though QlikView doesn't have tools like "Selections" or "Insights" that seem to be crashing on your machine.
Both of these tools are quite heavy. The "Selections" tool needs to build filters for all your fields and populate them with every distinct value each field contains, which can get expensive. The "Insights" tool needs to analyze your data in order to offer automatically generated insights. It is possible that with your data volume and your (minimal) hardware, these tools are simply too expensive from a performance standpoint.
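One common performance habit that directly reduces the distinct-value count those tools have to handle is splitting full timestamps into separate date and time fields. A minimal load-script sketch, assuming a hypothetical timestamp field called OrderTimestamp (adjust names to your model):

```
Facts:
LOAD
    // Hypothetical field names, for illustration only.
    OrderID,
    // Floor() drops the time part: roughly one distinct value per day.
    Date(Floor(OrderTimestamp))                    AS OrderDate,
    // Frac() keeps only the time part; rounding to the minute caps
    // the field at 1,440 distinct values instead of millions.
    Time(Round(Frac(OrderTimestamp), 1/(24*60)))   AS OrderTime
FROM ...;
```

With millions of unique timestamps replaced by a few thousand dates and at most 1,440 times, both the in-memory symbol tables and the "Selections" panel have far less work to do.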
It's quite difficult to diagnose the issue by guessing at the cause, but it is unlikely that loading 7 million rows of data, while following good performance habits, would do this on its own.
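As a diagnostic step, you could also reload with only the columns you actually use and a narrow row filter, then widen it gradually to see where performance degrades. A sketch of such a load, where the connection name, table, and fields are all hypothetical placeholders:

```
// Hypothetical connection and field names, for illustration only.
LIB CONNECT TO 'PostgreSQL_Connection';

Facts:
LOAD
    sale_id,
    customer_id,
    geo_id,
    time_id,
    amount;
SQL SELECT sale_id, customer_id, geo_id, time_id, amount
FROM sales
WHERE sale_date >= '2023-01-01';  -- start small, then widen the date window
```

Dropping unused columns matters as much as limiting rows: each extra field adds its own symbol table in RAM, and on an 8 GB machine that overhead adds up quickly.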
I can recommend that you analyze your document using the QS Document Analyzer tool:
Here is a Design Blog post that may shed some further light on things. Feel free to search around in that entire area; there is a lot of great information and plenty of examples there. Oleg's advice is great as well: he has been doing this as long as I have, but he is much better at the development side than I am! 🙂 Hopefully the following link proves useful, but again, if you back up one level you can search across the entire Design Blog site, and there are hundreds of posts in there.
To help other users find verified answers, please remember to use the "Accept as Solution" button on any post(s) that helped you resolve your problem or question. I now work a compressed schedule (Tuesday, Wednesday, and Thursday), so those are the days I will reply to any follow-up posts.