Our source system is SAP, and some tables have more than 40 crore (400 million) records. Our server configuration is: Intel(R) Xeon(R) CPU E5645 @ 2.40 GHz (2 processors), 192 GB RAM, 64-bit system.
It takes 1 hour to load 1 crore (10 million) records, so loading all 40 crore records would take about 40 hours. Can somebody help me out with the best system configuration?
Another thing to consider would be segmenting. For example, do all users need to look at the data at the most detailed level, or can you aggregate it up to a dashboard/analysis level? Or would it make sense to split the data into yearly .qvw files? Segmenting can be done in many ways, and with Publisher the reduction can be automated. Using Document Chaining, users can jump seamlessly from the dashboard or aggregated level down to the segment they want to analyze.
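As a rough sketch of the yearly-split idea, a load script loop like the one below could break a large fact table into one QVD per year, which the yearly documents then load individually. The table name `Transactions`, the field `TransactionDate`, and the year range are all hypothetical placeholders; adjust them to your own model, and note that the WHERE clause forces a non-optimized QVD load:

```
// Hypothetical sketch: split one large QVD into yearly QVDs.
// Assumes Transactions.qvd exists and has a TransactionDate field.
FOR vYear = 2010 TO 2013

  Transactions:
  LOAD *
  FROM Transactions.qvd (qvd)
  WHERE Year(TransactionDate) = $(vYear);   // filter to one year

  STORE Transactions INTO Transactions_$(vYear).qvd (qvd);
  DROP TABLE Transactions;                  // free memory before next year

NEXT vYear
```

Each yearly .qvw then loads only its own `Transactions_<year>.qvd`, keeping document size and reload time down, while Document Chaining ties the pieces together for the user.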