Not applicable

Performance issues - Slow load of document

Hello,

I have a source document which contains 220 million rows and is 11 GB in size (no compression).

This document is split out into almost 75 document files, each of which only one user has access to.

This means the largest document I have is approximately 1 GB; most of them are around 100 MB or so, and my expressions are not that complex.

But when the users want to see the documents, they are really slow - too slow in my opinion.

My server specs are:

132 GB RAM

12-core AMD Opteron, 2.1 GHz

64-bit Windows Server 2008 R2

What can be done?

10 Replies
marcus_sommer

How is your data model structured - several fact tables with a link table, or a single fact table? Both approaches have advantages and disadvantages. Besides reducing unnecessary fields and rows, splitting high-cardinality fields can save RAM. Also, section access causes longer load times.
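For example, loading only the fields and rows the front end really needs already helps (the table, field and file names below are just placeholders):

Facts:
LOAD CustomerID,
     OrderDate,
     Amount                            // keep only fields used in dimensions or expressions
FROM Facts.qvd (qvd)
WHERE OrderDate >= MakeDate(2013);     // drop history nobody looks at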

- Marcus

Not applicable
Author

Single fact table with only a mastercalendar attached to it.

marcus_sommer

Are there fields with many unique values, like a row ID, a timestamp, serial numbers or similar? Is section access used in the application?

- Marcus

Not applicable
Author

There is no section access.

Yes, I have a timestamp for each row, so there are many unique values. I also have a row ID for each row.

marcus_sommer

That's good - so you have great potential to reduce the RAM requirement. Split your timestamp into date and time:

floor(timestamp) as date,

frac(timestamp) as time

If you need the timestamp in an object you can simply use: date + time. Further, I assume your row ID isn't used by the users and only serves development purposes - I would comment it out.
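In the load script that could look something like this (the field and file names are just examples, and the Date()/Time() wrappers only add formatting):

Facts:
LOAD Date(Floor(Timestamp)) as Date,   // whole days - few distinct values
     Time(Frac(Timestamp))  as Time,   // time of day - repeats across all dates
     CustomerID,
     Amount                            // ...plus the other fields you actually use
FROM Facts.qvd (qvd);

In a chart you can still rebuild the original value as Date + Time.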

- Marcus

Not applicable
Author

Hi Thomas,

To clarify.

You have 75 different users, each accessing one document? Have you thought about using SECTION ACCESS and having a single document? (A rough example is sketched at the end of this post.)

How much memory is being used on the server when you have issues?  How much CPU is being used?

Can you define SLOW? Is it charts taking time to update, sitting with an hourglass and progress bar, or the documents taking too long to load?

In development, when you have not split the document out into 75 pieces, is the response OK?

The answers to the above will help others understand your issue, and they will therefore be more likely to offer help.

Also, I totally agree with the other comments posted here about TIMESTAMP: each value tends to be unique, so they do not compress well, whereas dates and times have relatively few distinct values in most documents.
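On the SECTION ACCESS point above, the reduction part of a script could look roughly like this (the account names and the REDUCTION field are only placeholders):

Section Access;
LOAD * INLINE [
    ACCESS, NTNAME,         REDUCTION
    ADMIN,  DOMAIN\QVADMIN, *
    USER,   DOMAIN\USER01,  1
    USER,   DOMAIN\USER02,  2
];
Section Application;

REDUCTION must also exist as a field (with the same upper-case name) in the data model, for example a customer number, and "Initial Data Reduction Based on Section Access" has to be enabled in Document Properties > Opening. Note that * here means "all values listed in this table", not all possible values of the field.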

Richard

Not applicable
Author

I have one master document which is split out into 75 customer documents (each containing only that one customer's data, which is also added to the document).

They access the document through a web ticket.

No hourglass or progress bar on the charts. It's just slow to render all the graphics, e.g. when I change sheets.

Response in the client tool is okay, whether or not I have split my document.

Peter_Cammaert
Partner - Champion III

In the QlikView Design Blog, Miguel wrote a nice article about the impact of reducing unique field values on document size (both on disk and in RAM). These apparently simple changes cause an immediate performance increase. Well worth your time:

http://community.qlik.com/blogs/qlikviewdesignblog/2014/03/11/the-importance-of-being-distinct

shane_spencer
Specialist

Thomas, do you have a lot of charts on a single sheet? Multiple charts will take longer to update. You could use tabs to switch between charts so that only one is shown at a time.