May I ask why you need to export 555,000 rows / 334MB to Excel? This is simply too much data for a straight export from the front end. I don't foresee many practical uses for Excel files that big (life is too short to wait while Excel loads, recalcs, etc.).
If you need a bulk transfer, use STORE in the load script to write the data to a CSV file.
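As a sketch, a STORE statement in the load script would look like this (the table name and output path here are placeholders — substitute your own):

```
// After the table has been loaded in the script:
Transactions:
LOAD TransID, Amount, OrderDate
FROM [Transactions.qvd] (qvd);

// Write the whole table out as a text (CSV) file:
STORE Transactions INTO [C:\Exports\Transactions.csv] (txt);
```

This bypasses the front end entirely, so the 555,000-row volume is not a problem.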
Otherwise, let the user reduce the number of records by making whatever selections, and then allow the export. You can use a conditional calculation to suppress the calculation of the table until the selections reduce the data to (say) 10,000 records or less. Also reduce the number of fields in the table, if necessary by splitting it into 2 or more tables. If you are doing the export in a macro fired by a button, conditionally enable the button as well.
The expressions for the conditional calc/enable depend on your data model. For example, if you have a transaction ID, then you could use something like:
=Count(Distinct TransID) < 10000
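To make that concrete, the same guard expression can drive both the chart and the button (TransID is just the example field from above — use whatever key fits your model):

```
// Chart Properties > General > Conditional (calculation condition):
=Count(Distinct TransID) < 10000

// Button Properties > Layout > Conditional Enable (same expression):
=Count(Distinct TransID) < 10000
```

Until the user's selections bring the count under the threshold, the chart shows its "calculation condition unfulfilled" message and the export button stays greyed out.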
Hi Jonathan & Khadeer,
Thank you both for your prompt responses. I learnt these things and applied them as well.
My question now is that even after reducing the number of records to some 28,000 rows, I am still having the same issue.
There are no synthetic keys, and there are only two expressions in the chart, both simple Sum() functions; no set analysis, nothing.
A couple of general things to look for:
- Are you using a calculated dimension? These can be performance killers.
- Are all the fields used in the chart's dimensions and expressions in the same table? In large data sets, spanning too many tables can result in out-of-memory errors, even more so if the tables are not associated (island or orphan tables).
- Are you using Sum(If(...)) expressions? Where the If() condition can instead be expressed as a set modifier, set analysis expressions will perform far better.
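For example (Year and Sales are hypothetical field names), these two expressions return the same result, but the first evaluates the If() for every row while the second resolves the condition once as a set:

```
// Slower — If() is evaluated row by row:
Sum(If(Year = 2011, Sales))

// Faster — the condition becomes a set modifier:
Sum({<Year = {2011}>} Sales)
```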
If you post your dimensions/expressions (or your model, if it's not too large), we could be more specific.