singhcv123
Contributor

How to load 10 million rows into QlikView from a DW database

Hi,

We are in the process of loading 10 million rows of data into QlikView.

The fact table has 52 columns.

There are more than 10 dimension tables.

Please suggest the right approach.

5 Replies
Greg_Williams
Employee

Keep the design to a star schema.

Only keep the absolutely required dimensions and comment the rest out.
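
For example, a minimal sketch of that layout in the load script (table and field names below are placeholders, and it assumes an ODBC/OLEDB CONNECT to the warehouse earlier in the script):

Fact:
SQL SELECT OrderID, CustomerKey, ProductKey, DateKey, Quantity, Amount
FROM dw.FactSales;

Customer:
SQL SELECT CustomerKey, CustomerName, Region
FROM dw.DimCustomer;

// Dimension not needed in this app - commented out so it is never loaded:
// Promotion:
// SQL SELECT PromotionKey, PromotionName
// FROM dw.DimPromotion;

The tables associate on the shared key field names (CustomerKey, ProductKey, and so on), giving a star around the fact table.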

Create expressions in the script and avoid using the = sign in chart objects; instead, make a reference call to the expression defined in the script.
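
A minimal sketch of that pattern, assuming hypothetical Amount and OrderID fields:

// Define the measures once in the load script:
SET vSalesAmount   = Sum(Amount);
SET vAvgOrderValue = Sum(Amount) / Count(DISTINCT OrderID);

The chart expression is then just $(vSalesAmount), so the formula lives in one place instead of being retyped in every object.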

If one of the dimensions is a date/time stamp, remove the time and reduce it to separate Year, Month, and Day fields (this reduces distinct values). Eliminate the time altogether if it is not going to be used in the app.
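
For instance, something along these lines in the load, with placeholder field names:

Fact:
LOAD
    OrderID,
    Year(OrderTimestamp)  as OrderYear,
    Month(OrderTimestamp) as OrderMonth,
    Day(OrderTimestamp)   as OrderDay,
    Amount;
SQL SELECT OrderID, OrderTimestamp, Amount FROM dw.FactSales;

Dropping the raw timestamp collapses millions of distinct values into a handful per calendar field, which keeps the data model much smaller.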

Eliminate as many distinct items as possible.

Do not use distinct in your calculations unless you must.

Avoid counting; instead, add the line 1 as <table name>_counter to your load script and use sum(<table name>_counter) in a chart, as this will calculate faster.
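
For example (table and field names are placeholders; use your own naming convention for the counter field):

Fact:
LOAD
    OrderID,
    Amount,
    1 as Fact_Counter;  // constant 1 per fact row
SQL SELECT OrderID, Amount FROM dw.FactSales;

In the chart, Sum(Fact_Counter) then returns the number of fact rows and evaluates faster than a Count() over a field.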

It goes without saying (but I'll state it) - be sure to place the data into a QVD file first, then use that as the source to build your qvw from.
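
A minimal two-stage sketch, assuming a hypothetical FactSales table and QVD folder:

// Extract layer - pull from the warehouse once and store as QVD:
FactExtract:
SQL SELECT * FROM dw.FactSales;

STORE FactExtract INTO [..\QVD\FactSales.qvd] (qvd);
DROP TABLE FactExtract;

// Application layer - build the qvw from the QVD instead of the database:
Fact:
LOAD * FROM [..\QVD\FactSales.qvd] (qvd);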

Hope this helps.

Greg

singhcv123
Contributor
Author

Thanks for the reply.

Additionally, can we break the data into smaller parts, say 1 million rows each, so that we can run multiple fact loads for the same table?

And from a UI perspective:

How do we limit the data in a chart?

With a big volume of data there are significant performance issues when showing data in a chart.

Not applicable

Very good summary. Just one more thing: with a 10-million-row QVD, make sure you read the QVD optimized, as it will be much faster than a non-optimized load.
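
Roughly speaking, the load stays optimized as long as it is a plain field list straight from the QVD with no calculations; at most a single Where Exists() filter is generally tolerated. A sketch with placeholder names:

// Optimized - straight field list, no expressions or ordinary WHERE clause:
Fact:
LOAD OrderID, CustomerKey, Amount
FROM [..\QVD\FactSales.qvd] (qvd);

// Not optimized - any calculation or normal WHERE forces a record-by-record read:
// LOAD OrderID, Amount * 1.2 as AmountWithVAT
// FROM [..\QVD\FactSales.qvd] (qvd)
// WHERE Amount > 0;

The reload progress window shows whether a table was read "(qvd optimized)".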

Greg_Williams
Employee

I am not sure what type of machine setup you have. For instance, are you using 256 GB of RAM or 64? How many processor cores? What is the speed of the processors? Is this on a VM image with other software running on the VM? How many concurrent users are hitting the data? I have gone against tens of millions of records with sub-second response times many times. Your performance depends on many things, but it never hurts to add more RAM.

Optimized QVD loads will load the data faster, which helps with reload times (as well as other factors).

Use variables in charts (avoid using the = sign). Consider using top N or bottom N. Avoid using pivot tables, straight tables, and table box objects on the same sheet. Keep the objects (like 'tables') minimized unless you must see the data all the time.
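
For the top N part, one way to cap a chart at the ten largest values is an expression like the following (Amount is a placeholder measure):

If(Rank(Sum(Amount)) <= 10, Sum(Amount))

combined with suppressing null values in the chart's presentation settings; in newer QlikView versions the Dimension Limits tab can restrict a chart to the first N values without any expression at all.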

Hope this is helpful to you.

Greg

Greg_Williams
Employee

If you are finished with this thread, to close it out, please mark as Helpful or Correct.

Thanks.

Greg