Hmmm, 3 and a half million rows shouldn't be a problem for QlikView. I'm currently working on a database with over 7 million rows in the main transaction table.
If the report has frozen overnight and eaten all the memory, my first thought is synthetic keys. If you have multiple fields with the same name in different tables, QlikView will try to link them together and create a synthetic key from the common fields. This happens at the end of the load and, if you're not careful, can produce exactly the symptoms you're describing.
Posting your load script here would help, but if you'd rather not, look through the tables you are loading and check for fields with the same name appearing in more than one table; you can always rename the fields if you don't want QlikView to link them. Another way to try to get it to load is to run the script in debug mode, tick the 'Limited Load' box on the left, and load just a handful of rows; depending on how bad things are, that may at least get it loaded to start with.
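For example, here's a minimal sketch of the renaming approach; the table names, field names and QVD paths are all hypothetical, just to illustrate the pattern:

```qlikview
// Both tables share the fields Date and Region. Loaded as-is, QlikView
// would build a synthetic key across those two common fields.
// Renaming them in one of the LOADs with AS breaks the unwanted link.

Sales:
LOAD
    OrderID,
    CustomerID,
    Date   AS SalesDate,    // renamed so it no longer matches Budget's Date
    Region AS SalesRegion,  // renamed so it no longer matches Budget's Region
    Amount
FROM [Sales.qvd] (qvd);

Budget:
LOAD
    Date,
    Region,
    BudgetAmount
FROM [Budget.qvd] (qvd);
```

If you do still want the tables associated, the usual alternative is to keep just one deliberately shared field (or build an explicit composite key) rather than letting QlikView synthesise one from every common field name.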
Hope that helps,
Hi Chris, thanks for the response. Unfortunately, due to the DPA I cannot share my extract, but I can confirm there are no synthetic keys, duplicated field names or data islands that would typically snarl up QlikView; it's just very big. I was wondering if there is any functionality in QlikView that can load it in sequential chunks somehow. I just think the server cannot cope with 3.5 million rows across 28 columns.