Hello all,
I have 8 tables totalling about 0.5 TB (which is also causing RAM issues), and all of them are needed for one app.
I'm confident many of you have come across this problem. How did you handle it? What can I do?
Thank you in advance,
Ioanna
I think you have 3 options:
You could also go through the fields and remove any that are not needed.
Hi @Jonathan_Dienst , thanks for replying.
Can you explain bullet number 3 for me?
Do you think I should use Hadoop or Spark?
I have seen gigabyte-sized tables crash an app; I don't know what to expect with terabytes 😕
Aggregate using Sum(), Max(), Avg(), etc., so that you have, for example, monthly data instead of daily data.
Another bullet: make sure any joins have properly aligned key field(s) between the two tables.
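To make these two points concrete, here is a minimal load-script sketch. All table and field names (DailySales, CustomerID, Cust_No, etc.) are invented for illustration; the idea is to collapse a daily fact table to monthly granularity with GROUP BY, then join a lookup table after renaming its key so it aligns:

```
// Assumed source already loaded: DailySales with CustomerID, OrderDate, Amount
MonthlySales:
LOAD
    CustomerID,
    MonthStart(OrderDate)  as OrderMonth,
    Sum(Amount)            as TotalAmount,
    Max(Amount)            as MaxAmount
RESIDENT DailySales
GROUP BY CustomerID, MonthStart(OrderDate);

DROP TABLE DailySales;  // keep only the aggregated table in RAM

// Join a (hypothetical) customer table; rename its key so it
// matches the field name in MonthlySales before joining
LEFT JOIN (MonthlySales)
LOAD
    Cust_No as CustomerID,
    Region
FROM Customers.qvd (qvd);
```

Dropping the detail table after aggregating is what actually frees the memory; the aggregated table is typically orders of magnitude smaller.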
That is not really much information to go on. The number of records or the size of the raw data alone often says little about how to handle and process the data within Qlik.
Most important is the data model, which should be developed in the direction of a star schema. Of course, only the records and fields that are really needed should be loaded, without unnecessary formatting, and high-cardinality fields such as timestamps should be split into dates and times, and similar. Only once you have done this, at least roughly, will you be able to estimate which resources are needed and whether it would be useful to add further measures such as mixed granularity, various flags, certain pre-calculations within the script, and so on.
Besides this, you will probably need a multi-tier data architecture to implement one or several layers of incremental loading.
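The timestamp-splitting point can be sketched as follows (field names are made up for illustration). A full timestamp is often near-unique per row, so Qlik's symbol table stores one distinct value per record; splitting the integer date part from the fractional time part leaves two fields with far fewer distinct values each:

```
// Hypothetical source with a full timestamp per event
Events:
LOAD
    EventID,
    Date(Floor(EventTimestamp)) as EventDate,  // whole days only
    Time(Frac(EventTimestamp))  as EventTime,  // time of day, repeats across days
    // optionally round the time part to the minute to cut cardinality further:
    Time(Round(Frac(EventTimestamp), 1/24/60)) as EventMinute
FROM Events.qvd (qvd);
```

Floor() takes the date part and Frac() the time part of a Qlik serial-number timestamp; a year of per-second timestamps (tens of millions of distinct values) becomes ~365 distinct dates plus at most 86,400 distinct times.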
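A common shape for such an incremental extract layer is the insert-and-update QVD pattern. This is a hedged sketch, not Marcus's exact setup: it assumes a source table with an OrderID primary key and a ModifiedDate change column, and that Orders.qvd already exists from an initial full load:

```
// 1. Find the high-water mark of the existing QVD
MaxDate:
LOAD Max(ModifiedDate) as MaxModifiedDate
FROM Orders.qvd (qvd);
LET vLastLoad = Peek('MaxModifiedDate', 0, 'MaxDate');
DROP TABLE MaxDate;

// 2. Pull only new or changed rows from the database
Orders:
SQL SELECT OrderID, Amount, ModifiedDate
FROM Orders
WHERE ModifiedDate > '$(vLastLoad)';

// 3. Append the untouched history, keeping the newest version of each key
Concatenate (Orders)
LOAD * FROM Orders.qvd (qvd)
WHERE NOT Exists(OrderID);

// 4. Store the merged result back for the next run
STORE Orders INTO Orders.qvd (qvd);
```

Each run then only moves the delta over the wire, and the heavy full-history read is an optimized QVD load instead of a database query.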
- Marcus
I see, thank you everyone. Very insightful information from all of you!